Contribute to Open Source. Search issue labels to find the right project for you!

add a sanity check when generating the changelog and an empty log is returned


When there’s nothing new to release, the changelog command freaks out:

$ GITHUB_ACCESS_TOKEN=[token] yarn changelog release-1.1.1-beta3
yarn run v1.5.1
$ ts-node script/changelog/index.ts release-1.1.1-beta3
Unable to parse line, using the full message. Error: Unable to parse ''
    at parseCommitTitle (/Users/shiftkey/src/desktop/script/changelog/parser.ts:20:11)
    at Object.convertToChangelogFormat (/Users/shiftkey/src/desktop/script/changelog/parser.ts:72:22)
    at (/Users/shiftkey/src/desktop/script/changelog/run.ts:51:34)
    at <anonymous>
    at process._tickCallback (internal/process/next_tick.js:118:7)
  "[???] "
✨  Done in 0.48s.

This is because the git log command may return an empty string when there are no new entries; splitting that empty string then yields an array containing a single empty value.

This skips the problematic split command completely.
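
A guard along these lines avoids the split entirely (a minimal sketch; `getLogLines` is a hypothetical helper, not the actual script/changelog code):

```typescript
// Sanity check sketch: `git log` prints an empty string when there are no new
// entries, and splitting '' yields [''], which the commit parser chokes on.
// Returning early skips the problematic split.
function getLogLines(stdout: string): ReadonlyArray<string> {
  const trimmed = stdout.trim()
  if (trimmed.length === 0) {
    return []
  }
  return trimmed.split('\n')
}
```

With this in place, an empty log simply produces an empty changelog instead of the "Unable to parse ''" error above.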

Updated 22/03/2018 15:55

update ts-node to latest version


We’re using this in a few places, so I want to upgrade it on its own and ensure the two major version changes don’t break anything we’re relying on.

  • [x] testing on macOS
    • [x] yarn build:dev
    • [x] yarn build:prod
    • [x] yarn lint
    • [x] yarn test
    • [x] yarn package
    • [x] yarn generate-octicons
    • [x] yarn cli
    • [x] yarn changelog
    • [x] yarn draft-release
  • [ ] testing on Windows
    • [ ] yarn build:dev
    • [ ] yarn build:prod
    • [ ] yarn lint
    • [ ] yarn test
    • [ ] yarn package
    • [ ] yarn generate-octicons
    • [ ] yarn cli
    • [ ] yarn changelog
    • [ ] yarn draft-release
  • [ ] testing on Ubuntu
    • [ ] yarn build:dev
    • [ ] yarn build:prod
    • [ ] yarn lint
    • [ ] yarn test
    • [ ] yarn package
    • [ ] yarn generate-octicons
    • [ ] yarn cli
    • [ ] yarn changelog
    • [ ] yarn draft-release
Updated 22/03/2018 02:54

Try out AWS Code*


Investigate leaving Codeship for AWS CodeBuild, CodeDeploy and CodePipeline so that there is a complete continuous delivery pipeline that will automatically push freshly baked Docker containers to the ECS cluster.

Updated 22/03/2018 01:12

Make it possible to set the kernel used in `make html`


Right now if you run make html (the only convenient way to build when not on RTD), the nbconvert machinery uses some default kernel (I think based on whatever name is stored in the notebook). Often that’s not desirable - for example, I have a special conda environment just for the tutorials, and other users probably will too to varying degrees.

So there should be some way to set which kernel is used to execute the tutorials when you do make html. It’s already possible in the scripts/ script using the --kernel-name option, so we just need to plumb that into the Makefile machinery (which I think goes through Sphinx to actually do the build). Probably the easiest thing is an environment variable trick, like TUTORIALS_KERNEL_NAME="blah" make html. But there might be some clever way to use the "current" environment, which might be just as good?
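
The environment-variable trick could look something like this in the Makefile (a minimal sketch; the variable name, its default, and the script path are assumptions, not the repository’s actual layout):

```make
# Hypothetical sketch: allow the kernel to be overridden from the environment,
# e.g.  TUTORIALS_KERNEL_NAME="tutorials-env" make html
TUTORIALS_KERNEL_NAME ?= python3

html:
	python scripts/convert.py --kernel-name $(TUTORIALS_KERNEL_NAME)
```

Since make gives environment variables precedence over `?=` defaults, `TUTORIALS_KERNEL_NAME="blah" make html` would just work, while a plain `make html` keeps the default.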

Updated 21/03/2018 21:04

Create CloudWatch Metric Backed Auto Scaling Spot Groups


We need some more scalability!

To that, we need:

  • [x] A TF representation of our CloudWatch Metric (basically our Nomad queue depth)
  • [x] A script which polls nomad for the queue depth and periodically publishes this value to the metric
  • [x] An updated server cloudinit script to run this script via cron
  • [x] A TF representation of an Auto Scaling Group which tracks this metric
  • [x] A strategy for bidding on spot instances (suggest we start by using current market price)
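
For the polling script, the interesting part is the shape of the published metric; a sketch of the pure part follows (the namespace and metric name are made-up, and the real script would read the depth from Nomad’s API and hand this payload to the AWS SDK’s putMetricData):

```typescript
// Hypothetical sketch: turn a sampled Nomad queue depth into a CloudWatch
// PutMetricData payload. 'Custom/Nomad' and 'QueueDepth' are illustrative
// names; substitute whatever the Terraform metric definition actually uses.
interface MetricPayload {
  Namespace: string
  MetricData: Array<{ MetricName: string; Value: number; Unit: string }>
}

function queueDepthPayload(depth: number): MetricPayload {
  return {
    Namespace: 'Custom/Nomad',
    MetricData: [{ MetricName: 'QueueDepth', Value: depth, Unit: 'Count' }],
  }
}
```

The cron job from the cloudinit script would then just sample, build the payload, and publish once per interval.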


Updated 21/03/2018 20:08 2 Comments

Investigate moving non-Windows specific functionality into a separate library


With a view to extending the capabilities of generated apps and connecting them with other solutions, it seems appropriate to move some of the functionality that is currently included in the app into a separate, referenced assembly.

The purpose of this issue is to track investigation of this idea and see what can or should be separated.

Based on expectations about how UWP apps will be used as part of X-Plat solutions, I propose to leave the ViewModels in the main app project. This helps with consistency for apps built with CodeBehind and also with compatibility of existing templates. The main reasons for keeping ViewModels separate are reuse and testability; reuse isn’t expected to be needed here, so coupling the VMs to Windows specifics isn’t a problem, and the code inside the app will still be testable, so that’s not a blocker either.

Updated 22/03/2018 13:29 3 Comments

Use pre-built Docker image


The build time can be significantly reduced if we pull a pre-built image from Docker Hub. make shell should pull an image from Docker Hub by default.

PRs should use a pre-built image, however when Travis builds master we should re-build the image first and push it to Docker Hub.

All of this should be transparent as it is all abstracted by the Makefile.

  • [x] Set up Docker Hub repository
  • [x] Modify Makefile to pull image from Docker Hub
  • [ ] Modify travis.yml to push new image when commit is made to master
  • [ ] Update documentation
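
The Makefile/Travis split could be sketched like this (the image and target names are assumptions, not the project’s actual ones):

```make
# Hypothetical sketch: pull the pre-built image for everyday use; rebuild and
# push only from the Travis master build. IMAGE is a made-up name.
IMAGE ?= ourorg/build-env:latest

shell:
	docker pull $(IMAGE)
	docker run --rm -it -v $(CURDIR):/src $(IMAGE)

push-image:
	docker build -t $(IMAGE) .
	docker push $(IMAGE)
```

In .travis.yml, only the master-branch job would run `make push-image` after a successful build; PR jobs would only ever pull, keeping the abstraction inside the Makefile.
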
Updated 20/03/2018 22:15

Combine multiple samples from an experiment into a matrix



New Issue Checklist

  • [x] The title is short and descriptive
  • [x] The issue contains an:
    • [ ] Idea (new feature, user story, etc)
    • [ ] Problem
  • [ ] You have explained the:
    • [ ] Context
    • [ ] Problem or idea
    • [ ] Solution or next step

      Problem or idea & context

If a user wants an entire experiment (i.e., provides an experiment accession number), we should provide this data in the form of a matrix rather than multiple files for individual samples. Once samples are either processed with SCAN or, in the case of submitter-processed data from ArrayExpress, sample_table.txt files are obtained, we’ll need to bind/join these samples together (as columns) to provide the experiment matrix to the user. (Right now, this only applies to microarray data, as we plan to use tximport at the experiment level.)

Solution or next step

Based on a quick, preliminary discussion with @Miserlou, pandas might be the way to go.

Updated 20/03/2018 19:27

relax regex rule to catch more merged PRs when generating changelog


This came up as part of #4234, and it looks like some PRs can link to issues using `Fixes: #id` (with the colon). This is valid syntax for auto-closing issues when the PR is merged.

Our parser doesn’t handle this, so it was overlooked in the release notes generator:

$ git checkout master
Switched to branch 'master'
Your branch is up to date with 'origin/master'.

$ GITHUB_ACCESS_TOKEN=[token] yarn draft-release beta | grep \#4004

$ git checkout relax-regex-for-fixed-issue
Switched to branch 'relax-regex-for-fixed-issue'

$ GITHUB_ACCESS_TOKEN=[token] yarn draft-release beta | grep \#4004
    "[Fixed] Add visual indicator that a folder can be dropped on Desktop - #4004. Thanks @agisilaos!",
  • [x] extract this into something more testable
  • [x] throw a bunch of the documented keywords at it
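
The relaxed rule can be as small as allowing an optional colon after the closing keyword (a sketch, not the actual parser.ts change; the keyword set follows GitHub’s documented issue-closing keywords):

```typescript
// Hypothetical sketch: match GitHub's closing keywords with an optional
// trailing colon, so both "Fixes #4004" and "Fixes: #4004" are caught.
const CLOSES_ISSUE = /\b(?:close[sd]?|fix(?:e[sd])?|resolve[sd]?):?\s*#(\d+)/i

function fixedIssueNumber(text: string): number | null {
  const match = CLOSES_ISSUE.exec(text)
  return match ? parseInt(match[1], 10) : null
}
```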
Updated 22/03/2018 17:35 1 Comment

github backups


Create backups for the entire CICE Consortium github space (not just this CICE repo).

@apcraig and @dabail10 are already working on this. The plan is for NCAR to include it as part of its regular CESM github backup process.

Updated 19/03/2018 22:02

Inconsistent CircleCI builds


@eefahy something to look into: I tend to see a lot of initial (and sometimes repeated) build failures that eventually pass with a rerun. I’m assuming this is a timing issue with getting localstack running, as the errors are 500 responses from the services. Should we extend the pause time there again, or is there something else that may need attention?

Updated 20/03/2018 21:23 1 Comment

shell.nix depends on build artifacts


Steps to reproduce:

  1. git clone --recursive git://
  2. cd ghc/hadrian
  3. git pull && git checkout master
  4. nix-shell --pure

Expected result: Successful build.

Actual result:

hadrian(master)$ nix-shell --pure
these derivations will be built:
building '/nix/store/lmpgkvswlkcgpcjdilbyh202vm8a5i5r-directory-'...
building '/nix/store/1b54g6nv8rw0kq3i5xjpcbhy88jvdnbr-xml-1.3.14.drv'...
Build with /nix/store/d57p5gzyld7b3y4irzv70bfa3d8xxwgx-ghc-8.2.2.
Build with /nix/store/d57p5gzyld7b3y4irzv70bfa3d8xxwgx-ghc-8.2.2.
unpacking sources
unpacking source archive /nix/store/khjm3agcb1y91925r78h3chaq56w9fi7-6wsbrbb60dzbfqf62c6sskwcijc3arjj-directory
unpacking sources
unpacking source archive /nix/store/dnx7hpl7wz81r3av4l09pqqhdqi4qpg6-xml-1.3.14.tar.gz
source root is xml-1.3.14
source root is 6wsbrbb60dzbfqf62c6sskwcijc3arjj-directory
setting SOURCE_DATE_EPOCH to timestamp 1424727750 of file xml-1.3.14/xml.cabal
patching sources
patching sources
setupCompileFlags: -package-db=/tmp/nix-build-xml-1.3.14.drv-0/package.conf.d -j1 -threaded
setupCompileFlags: -package-db=/tmp/nix-build-directory- -j1 -threaded
[1 of 1] Compiling Main             ( Setup.hs, /tmp/nix-build-directory- )
[1 of 1] Compiling Main             ( Setup.hs, /tmp/nix-build-xml-1.3.14.drv-0/Main.o )
Linking Setup ...
Linking Setup ...
configureFlags: --verbose --prefix=/nix/store/v3yazy78gy6kkpvrcn3schm8lc8y5dgl-xml-1.3.14 --libdir=$prefix/lib/$compiler --libsubdir=$pkgid --docdir=/nix/store/zdzgar7mwysym9dbai87fann20pbw75n-xml-1.3.14-doc/share/doc --with-gcc=gcc --package-db=/tmp/nix-build-xml-1.3.14.drv-0/package.conf.d --ghc-option=-optl=-Wl,-rpath=/nix/store/v3yazy78gy6kkpvrcn3schm8lc8y5dgl-xml-1.3.14/lib/ghc-8.2.2/xml-1.3.14 --ghc-option=-j1 --disable-split-objs --disable-library-profiling --disable-profiling --enable-shared --disable-coverage --enable-library-vanilla --enable-executable-dynamic --enable-tests --ghc-option=-split-sections
configureFlags: --verbose --prefix=/nix/store/59r1c7sfgph9j2xxsc1k3j6dwkm32ybk-directory- --libdir=$prefix/lib/$compiler --libsubdir=$pkgid --docdir=/nix/store/w0gkzrf5q6ng40vzyvdl5m8pvw9v9dxy-directory- --with-gcc=gcc --package-db=/tmp/nix-build-directory- --ghc-option=-optl=-Wl,-rpath=/nix/store/59r1c7sfgph9j2xxsc1k3j6dwkm32ybk-directory- --ghc-option=-j1 --disable-split-objs --disable-library-profiling --disable-profiling --enable-shared --disable-coverage --enable-library-vanilla --enable-executable-dynamic --enable-tests --ghc-option=-split-sections
Configuring directory-
Warning: The 'build-type' is 'Configure' but there is no 'configure' script.
You probably need to run 'autoreconf -i' to generate it.
Configuring xml-1.3.14...
Dependency base >=3 && <5: using base-
Dependency bytestring -any: using bytestring-
Dependency text -any: using text-
Dependency base >=4.5 && <4.12: using base-
Dependency directory -any: using directory-
Dependency filepath >=1.3 && <1.5: using filepath-1.4.2
Dependency time >=1.4 && <1.9: using time-
Dependency unix >=2.5.1 && <2.8: using unix-
Source component graph:
    component lib
    component test:test dependency lib
Configured component graph:
    component directory-
        include base-
        include time-
        include filepath-1.4.2-DyDAQ5oOwBVDvLxMoNLDxx
        include unix-
    component directory-
        include base-
        include directory-
        include filepath-1.4.2-DyDAQ5oOwBVDvLxMoNLDxx
        include time-
        include unix-
Linked component graph:
    unit directory-
        include base-
        include time-
        include filepath-1.4.2-DyDAQ5oOwBVDvLxMoNLDxx
        include unix-
    unit directory-
        include base-
        include directory-
        include filepath-1.4.2-DyDAQ5oOwBVDvLxMoNLDxx
        include time-
        include unix-
Ready component graph:
    definite directory-
        depends base-
        depends time-
        depends filepath-1.4.2-DyDAQ5oOwBVDvLxMoNLDxx
        depends unix-
    definite directory-
        depends base-
        depends directory-
        depends filepath-1.4.2-DyDAQ5oOwBVDvLxMoNLDxx
        depends time-
        depends unix-
Using Cabal- compiled by ghc-8.2
Using compiler: ghc-8.2.2
Using install prefix:
Executables installed in:
Libraries installed in:
Dynamic Libraries installed in:
Private executables installed in:
Data files installed in:
Documentation installed in:
Configuration files installed in:
No alex found
Using ar found on system at:
No c2hs found
No cpphs found
No doctest found
Using gcc version 6.4.0 given by user at:
Using ghc version 8.2.2 found on system at:
Using ghc-pkg version 8.2.2 found on system at:
No ghcjs found
No ghcjs-pkg found
No greencard found
Using haddock version 2.18.1 found on system at:
No happy found
Using haskell-suite found on system at: haskell-suite-dummy-location
Using haskell-suite-pkg found on system at: haskell-suite-pkg-dummy-location
No hmake found
Using hpc version 0.67 found on system at:
Using hsc2hs version 0.68.2 found on system at:
Using hscolour version 1.24 found on system at:
No jhc found
Using ld found on system at:
No lhc found
No lhc-pkg found
No pkg-config found
Using runghc version 8.2.2 found on system at:
Using strip version 2.28 found on system at:
Using tar found on system at:
No uhc found
Setup: configure script not found.
builder for '/nix/store/lmpgkvswlkcgpcjdilbyh202vm8a5i5r-directory-' failed with exit code 1
cannot build derivation '/nix/store/dck8p81zlij9qksglaw8hkp11msrm426-hadrian-': 1 dependencies couldn't be built
error: build of '/nix/store/dck8p81zlij9qksglaw8hkp11msrm426-hadrian-' failed

Notice that the build fails for a library shipped with GHC. In shell.nix it’s included as localPackage.


First, we need to run configurePhase of the default Nix derivation for GHC HEAD:

  1. cd ghc
  2. nix-shell '<nixpkgs>' -A haskell.compiler.ghcHEAD
  3. configurePhase
  4. exit

Now we can cd into ghc/hadrian and proceed normally; the build won’t fail.

The issue is that this step shouldn’t be required; shell.nix should be self-contained.

Updated 20/03/2018 10:31 11 Comments

Kubernetes Alerts for Admin Region


We need to have the same alerting infrastructure as in other control plane regions.

Kubelets are hanging and Bonds are failing in the admin region as well. With only 3 nodes the result is catastrophic.

  • Deploy kube-monitoring
  • Hook it into Grafana/Alertmanager
Updated 19/03/2018 12:16 4 Comments

Look into Human Connection


As per the linked references.

Human Connection is an independent and non-profit online platform that combines a social network, a knowledge network and an action network. We go far beyond the usual comments section and provide useful tools to exchange, research and discuss ideas as well as the space to start realizing projects together.

Updated 18/03/2018 20:55

Organizations Abstract


The site admin needs the ability to create organizations for a single user or a group of users, so that polls and surveys can be created as an organizational unit. These will likely represent real-world organizations. The first iteration is simply a layer providing meaningful indexing and grouping of related users and data.

Updated 18/03/2018 00:01

Add User.js Download Option


User.js could be used if someone wants to keep some of the settings of an already existing profile.

The user.js has to be placed in the profile folder and is applied every time Firefox starts. Its settings are simply merged into the existing prefs.js. To change or remove a user.js setting through about:config, the user would have to edit or delete the user.js file.

This is my understanding anyways and what I’ve figured from experimenting with it.

Offering it as an alternative to the prefs.js (or explaining that one can rename the file to user.js, start FF and then delete user.js to keep already existing configuration) could be useful for some.

What do you think?

Updated 17/03/2018 22:20 1 Comment

[DNMY] Migrate to ASP.NET Core 2.1


Migrate the application to ASP.NET Core 2.1. Changes include:

  1. Updating to .NET Core SDK 2.1.300.
  2. Updating to ASP.NET Core 2.1.0.
  3. Using some new ASP.NET Core idioms (such as setting the compatibility level).
  4. Using Microsoft.Extensions.Http (HttpClientFactory) for use of HttpClient.
  5. Using Microsoft.AspNetCore.Mvc.Testing for integration tests.

Updated 18/03/2018 20:08 1 Comment

Migrate to myhpom-service user


Migrate MyHpom builds to use the new myhpom-service user instead of xdci-service. To see the secret UID, log in to any MyHpom machine and run getent passwd myhpom-service

Both xdci-service and myhpom-service are members of the same “service accounts” group (the group name actually has an embedded space, go figure). This should have no effect on this issue.

Updated 16/03/2018 18:59

Unknown suppression TreatAsOutOfBox


#2667 ported #2649, but in the uwp6.1 branch I get the following failure that we don’t see in Master…

E:\A_work\2035\s\wcf\Tools\Packaging.targets(1103,5): error : Unknown suppression TreatAsOutOfBox [E:\A_work\2035\s\wcf\src\System.ServiceModel.Http\pkg\System.ServiceModel.Http.pkgproj]

The uwp6.1 branch was snapped from the uwp6.0 branch which was in turn snapped from Master back in summer, so in many ways it is quite far behind Master such as buildtools versions and other dependencies.

@ericstj and @joperezr do you have any idea what I might need for this suppression to be understood?

Updated 19/03/2018 19:58 10 Comments

Setup relationships between bounded contexts on deployments


Every bounded context backend should have one environment variable telling it which topic to receive on: KAFKA_BOUNDED_CONTEXT_TOPIC

and another variable holding a semicolon-separated (;) list of topics to send to: KAFKA_BOUNDED_CONTEXT_SEND_TOPICS

With this configured properly, inter-bounded-context communication should be in place.


  • [ ] Identify relationships between bounded contexts
  • [ ] Configure YML deployments for Kubernetes with correct environment variables
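
Reading those two variables in a backend could look like this (a sketch; only the variable names come from this issue):

```typescript
// Hypothetical sketch: the receive topic is a single value, the send topics a
// semicolon-separated list. Empty segments are dropped, so a trailing ';' in
// the deployment YAML is harmless.
function parseSendTopics(raw: string | undefined): string[] {
  return (raw ?? '')
    .split(';')
    .map(topic => topic.trim())
    .filter(topic => topic.length > 0)
}

const receiveTopic = process.env.KAFKA_BOUNDED_CONTEXT_TOPIC
const sendTopics = parseSendTopics(process.env.KAFKA_BOUNDED_CONTEXT_SEND_TOPICS)
```
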
Updated 22/03/2018 09:13

GUI for test server (main.js) settings


The test server (npm start) supports lots of settings by manually modifying the URL. It would be nice to have an actual GUI with checkboxes for boolean options (e.g. displayMode) and text boxes for others (e.g. adding macros). I think it would also make sense to always keep the URL in sync with the current state, instead of requiring a press of the Permalink button.

Previously discussed in #1193.
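
Keeping the URL in sync could be done by calling history.replaceState on every settings change; the query-building part might look like this (a sketch; the option names are just examples):

```typescript
// Hypothetical sketch: serialize the current settings into a query string.
// In the page, each change handler would then call something like
//   history.replaceState(null, '', '?' + buildQuery(currentSettings))
function buildQuery(settings: Record<string, string>): string {
  return new URLSearchParams(settings).toString()
}
```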

Updated 17/03/2018 17:17 1 Comment

Add vim tests


The tests at the moment cover only the Python part but leave out the vim_main function. This is a bit harder to test (launch vim on a file, pass commands / a macro that calls overleaf-commenter at a specific position, save and exit, and then check the output file).

Updated 15/03/2018 20:57
