Contribute to Open Source. Search issue labels to find the right project for you!



A better name is required.

Important things:

  • googleability
  • short and catchy?
  • easily pronounceable
  • makes it clear that it provides an LDAP solution

The current name, LdapAndA:

  • has a dual meaning: LDAP & A / LDAP-Panda
  • but neither meaning really works.
Updated 19/08/2017 12:39 2 Comments

Feedback dump API timed out for huge number of feedback records


One of the REST APIs of the feedback-portal service takes a dump of the feedback records for a given date and stores them in the Elasticsearch database.

While taking the dump of all feedback records for a given date, the server timed out due to the huge number of records, so it couldn’t store the feedback records in the Elasticsearch database. This leads to a mismatch between the total feedback records in the feedback-portal service and in the Elasticsearch database.

For the following dates, the feedback record counts are as follows:

DATE        Records
31-03-2017  156
10-04-2017  122
11-04-2017  95
12-04-2017  540
Total       913

Total records on the feedback service as of 18-April-2017: 4073
Total feedback records in the Elasticsearch database as of 18-April-2017: 3160 (4073 - 913)

Ideally the record counts on both services should be the same; due to this issue there is a mismatch.
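One common mitigation, sketched here as an assumption rather than the service’s actual design, is to page the dump instead of fetching every record in one request, so no single HTTP call has to survive the whole data set:

```python
def chunked(records, size=100):
    """Yield successive batches of at most `size` records.

    Each batch would then be bulk-indexed into Elasticsearch on its
    own, keeping every request small enough to finish before the
    server's timeout.
    """
    batch = []
    for record in records:
        batch.append(record)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch
```

After a failed batch, only that batch needs retrying, so the two stores can be reconciled instead of silently diverging.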

Updated 19/08/2017 13:04 2 Comments

Improve installation script


The script, which I’ve only tested on Ubuntu 14.04 and 16.04 (fresh VMs), is a pretty bare-bones installation script that does something like:

  • apt-get dependencies
  • install, verify and build libsodium
  • set up a virtualenv and install all the dependencies into it

There are tons of ways it could be improved, but this is not really my skill set, so I’d love to get some help. Some points:

  • Installs libsodium into system - this should be localized somehow, as I understand it.
  • Doesn’t install PyQt, see note here about interaction with virtualenv. This is only needed if someone wants to run the GUI from command line, low priority.
  • Check that it behaves properly on all error conditions
  • Is there a better alternative to gpg --import-ownertrust which means jedisct1’s key is fully trusted? Perhaps just delete it after?
  • Can the script automatically put the user into the virtualenv? Instructions now tell user to enter virtualenv after install, which is fine but a bit annoying.
  • An obvious one but the biggest one: changes required to make it work on a broader set of OSes (I don’t know which linuces it’ll play nice on right now).

  • probably several other things I haven’t thought of.

Updated 19/08/2017 12:00

Wayland support


On GNOME with Wayland (as seen on Debian Buster), GdkX11.X11Window.foreign_new_for_display throws TypeError: argument display: Expected GdkX11.X11Display, but got __gi__.GdkWaylandDisplay, so the title bar isn’t hidden. This happens because GdkX11.X11Display.get_default() returned a Wayland display, not an X11 display.
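One defensive pattern, offered here as a sketch rather than the project’s actual fix, is to check the display’s type before touching any GdkX11 API and skip the X11-only title-bar code under Wayland. The name-based check below is an approximation; where GdkX11 can be imported, an isinstance check against GdkX11.X11Display is more robust:

```python
def is_x11_display(display):
    """Heuristically detect an X11 display object by its type name.

    Under GNOME on Wayland, Gdk.Display.get_default() yields a
    GdkWaylandDisplay, so X11-only calls such as
    GdkX11.X11Window.foreign_new_for_display must be skipped.
    """
    return "X11Display" in type(display).__name__

# Hypothetical call site:
#     if is_x11_display(display):
#         GdkX11.X11Window.foreign_new_for_display(display, xid)
```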

Updated 19/08/2017 11:50

Separate sensor.['armWarn'] for ARMAWAY and ARMHOME


For each sensor in the configuration file, there is an [‘armWarn’] element. The wiki says:

armWarn: Boolean. Can be set to false if you wish to exclude the sensor from being checked when arming a zone with tripped sensors. Default value is true.

As it works now, if armWarn is set to true for a sensor, you will get a warning telling you that the sensor is tripped when arming a zone. Furthermore, if canArmWithTrippedSensors is set to false for the zone, the arming attempt is refused.

That also means that if armWarn is set to false for a sensor, you won’t get a warning if the sensor is tripped when arming a zone. Furthermore, even if canArmWithTrippedSensors is set to false, the sensor’s armWarn setting overrides the zone’s canArmWithTrippedSensors and arming can proceed.

My thoughts today are the following:

  1. Is the sensor configuration element armWarn named in a way that can be misleading? The setting affects not only warnings but also the ability to block an arming attempt (e.g. it bypasses the zone’s canArmWithTrippedSensors setting). Suggestions from native English speakers for a better name for the armWarn element are welcome!

  2. Should we have different armWarn settings for “Arming Home” and “Arming Away”? For example, I have a bedroom window that I want to keep a bit open at night when I normally Arm Home. I don’t want any warning or blocking when I Arm Home. However, if I Arm Away, I’d certainly like the arming attempt to be blocked.
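Suggestion 2 could be made concrete with per-mode flags in the sensor configuration. The keys armWarnHome and armWarnAway below are invented names for illustration only, and the JSON shape is an assumption, not the project’s actual config format:

```json
{
    "bedroomWindow": {
        "armWarnHome": false,
        "armWarnAway": true
    }
}
```

With this, arming Home would skip both the warning and the block for the open window, while arming Away would still warn and refuse.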

Updated 19/08/2017 11:48

Implement a Bubble Plot


Using xChart, create an interface that enables easy rendering of a bubble plot from a tablesaw table. See the implementation of scatter plots for a very similar example. The task is basically to add a new chart type for BubblePlot, with a constructor that takes exactly three columns of numeric data (subclasses of NumericColumn).

Updated 19/08/2017 11:34

Implement a pie chart


Using xChart or JavaFX Charts, create an interface that enables easy rendering of a pie chart from a tablesaw table. See the implementation of bar plots (which use JavaFX) for a very similar example.

Updated 19/08/2017 11:27



A requirement arising from issue #23: there are currently some problems when decoding and encoding webm for use with the Mini Program (小程序).




Updated 19/08/2017 11:33 1 Comments

Is triton compatible with pypy?


Hi, I want to use triton with pypy, so I set up a virtual environment using pypy as the interpreter. After compiling and installing triton in the virtual environment, I execute import triton, but it reports ImportError: No module named triton. However, I can find it in the site-packages directory of the environment.

If I do not use pypy as the default interpreter in the virtual environment, import triton succeeds. My question is how to solve this problem. Thanks very much.

Updated 19/08/2017 19:14

Add classes for Steam API responses


For better IDE support and an easier API, responses from Steam should be packed into classes with known, named fields, etc.

Probably all serious API requests should return such classes.

To avoid breaking backwards compatibility, this should first be added as an option, then the old behaviour deprecated, and finally all parsing done through these classes.
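Since the steam package is Python, the idea can be sketched with dataclasses. The class name, fields, and from_response helper below are invented for illustration; steamid and personaname mirror the raw Web API field names, but what the library actually returns is an assumption here:

```python
from dataclasses import dataclass

@dataclass
class PlayerSummary:
    """Hypothetical typed wrapper around a raw response dict."""
    steam_id: int
    persona_name: str

    @classmethod
    def from_response(cls, data):
        # Known, named fields instead of a bare dict, so IDEs can
        # autocomplete and type-check call sites.
        return cls(steam_id=int(data["steamid"]),
                   persona_name=data["personaname"])
```

During the deprecation window, the raw dict could still be carried alongside the class so existing callers keep working.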

Updated 19/08/2017 12:39 7 Comments

NSApplication Notification Handling


Apparently it’s anything but trivial to send notifications via macOS' system APIs when the sender isn’t a “normal” application. Workarounds are possible, albeit rather hacky and cumbersome/impossible from Swift.

There’s gotta be a way 🤔

“Possible solutions”

  • Fork mac-notification-sys into a Swift package with the same idea
  • Wrap terminal-notifier
  • …?
Updated 19/08/2017 21:58 1 Comments

Renaming Composer package ?


When I created this Composer package, I named it imal-h/pdf-box, but within a few hours an issue was opened by a Java developer saying that pdf-box is a trademark and I should not use it, so I had to rename the repo to PDFLib.

Now the problem is that people still have to use

composer require imal-h/pdf-box

And I’m thinking: should I abandon this package and recreate a new Composer package with the name imal-h/pdflib?

Unfortunately this would lose all the download count, and with that the credibility goes down too. Let me know your thoughts.

Updated 19/08/2017 10:18

Use meta on self defined objects


From @lal-s on August 18, 2017 15:18

I want to add meta data to my attributes. For example, I have the following classes:

class Address < Dry::Struct
  attribute :street, Types::String
  attribute :city, Types::String
end

class Account < Dry::Struct
  attribute :account_num, Types::Int
end

class User < Dry::Struct
  attribute :name, Types::String.meta(my_meta: 'meta')
  attribute :address, Address
  attribute :accounts, Types::Array.member(Account).meta(resource: 'my resource')
end

I want to be able to add metadata to the address attribute in User. Is there a way to do it? i.e.

attribute :address, Address.meta(my_meta: 'meta')

It would probably work if I could define it differently, e.g. attribute :address, Types::Address.meta(my_meta: 'meta'), just so that it lets me use meta on it.

Copied from original issue: dry-rb/dry-types#209

Updated 19/08/2017 19:39 4 Comments

xSQLServer: Integration test should also run tests using SqlServer module


Details of the scenario you tried and the problem that is occurring: Integration tests should also run using the SqlServer PowerShell module, on top of testing with the SQLPS PowerShell module.

This is relevant to issue #509 as well.

The DSC configuration that is using the resource (as detailed as possible): n/a

Version of the Operating System, SQL Server and PowerShell the DSC Target Node is running: n/a

What module (SqlServer or SQLPS) and which version of the module the DSC Target Node is running: SqlServer

Version of the DSC module you’re using, or ‘dev’ if you’re using current dev branch: Dev

Updated 19/08/2017 09:01

Uncaught SyntaxError: Unexpected token < in JSON at position 0


[Enter steps to reproduce:]

Atom: 1.19.2 x64 Electron: 1.6.9 OS: Ubuntu 16.04.3 Thrown From: ftp-remote-edit package 0.11.11

Stack Trace

Uncaught SyntaxError: Unexpected token < in JSON at position 0

At file:///usr/share/atom/resources/app/static/index.html:1

SyntaxError: Unexpected token < in JSON at position 0
    at JSON.parse (<anonymous>)
    at ConfigurationView.loadConfig (/packages/ftp-remote-edit/lib/views/configuration-view.js:416:26)
    at ConfigurationView.reload (/packages/ftp-remote-edit/lib/views/configuration-view.js:456:31)
    at FtpRemoteEdit.editServers (/packages/ftp-remote-edit/lib/ftp-remote-edit.js:182:28)
    at HTMLElement.ftpRemoteEditEditServers (/packages/ftp-remote-edit/lib/ftp-remote-edit.js:39:50)
    at CommandRegistry.module.exports.CommandRegistry.handleCommandEvent (/usr/share/atom/resources/app/src/command-registry.js:265:35)
    at CommandRegistry.handleCommandEvent (/usr/share/atom/resources/app/src/command-registry.js:3:65)
    at CommandRegistry.module.exports.CommandRegistry.dispatch (/usr/share/atom/resources/app/src/command-registry.js:166:25)
    at AtomEnvironment.module.exports.AtomEnvironment.dispatchApplicationMenuCommand (/usr/share/atom/resources/app/src/atom-environment.js:1338:34)
    at EventEmitter.outerCallback (/usr/share/atom/resources/app/src/application-delegate.js:334:31)
    at emitThree (events.js:116:13)
    at EventEmitter.emit (events.js:194:7)


Commands

     -0:13.3.0 ftp-remote-edit:edit-servers (atom-workspace.workspace.scrollbars-visible-always.theme-one-dark-syntax.theme-one-dark-ui)
     -0:09.7.0 core:confirm (input.hidden-input)
     -0:03.1.0 ftp-remote-edit:edit-servers (

Non-Core Packages

autocomplete-php 0.3.7 
ftp-remote-edit 0.11.11 
language-volt 0.2.2 
Updated 19/08/2017 12:17 2 Comments

10.2 Ready! Let's start testing.


Hey @Windows-XAML/template-10-developer-community

Great news. What you find in Master today is the near-finished version of 10.2. You read that right, we’re close. There are some things I have not added yet because of RS3, but you can test this on Creators Update with Visual Studio 2017 v15.2 or v15.3 (which is brand new).

What to test.

Blank and Minimal are all we have working at this point. But they should be working. We’re defaulting to MvvmLight at this point, and you will see it’s going to be simple to provide guidance to use any other DI, MVVM framework, or message aggregator for teams that need that.

Then what?

Getting this wrapped in NuGet packages and an updated VSIX will be no small trick. But we won’t start toward that until we get testing in place. We’re also, finally, set up for unit testing, though I have not added any tests yet. We’re also set up to be portable to Xamarin.

Pull requests?

Yes. Be sure you are submitting for Master and be sure to flag your PR with 10.2 or something so I know that you know what you are fixing. And if you want to create an issue, use some type of 10.2 prefix in the name so I can keep those categorized correctly.


I hope so. I know I am. It’s been a lot of work, a lot of late nights, a lot of refactoring over and over. If we need to do some big fixes or some major refactoring, I am not afraid to do that at this point. In the meanwhile, I think what you see in the repo today is going to be close to the real thing.

Updated 20/08/2017 07:36 8 Comments



During DPC, @clue and I had a long chat about how to handle body parsers without losing the current flexibility. We came to the conclusion that middleware is the way to go. The following proposal is inspired by the WIP PSR-15 but doesn’t implement it (I’ll get back to that later in this PR).

Suggested reading order

This PR contains a lot of changes; however, most of them are in examples and tests. Here is a recommended reading order:

  2. MiddlewareInterface.php
  3. MiddlewareStackInterface.php
  4. MiddlewareStack.php
  5. Server.php
  6. Middleware/Callback.php
  7. Middleware/LimitHandlers.php
  8. Middleware/Buffer.php
  9. The rest

Major changes

Where 0.7 only requires you to pass a callable to handle incoming requests, this PR proposes using middleware for that (while still leaving the callable way intact by magically wrapping it). One way to set up middleware is by passing an array of middlewares implementing MiddlewareInterface:

$server = new Server([
    new Buffer(),
    new Callback(function ($request) {
        return new Response(200);
    }),
]);

Or by passing in a concrete implementation of MiddlewareStackInterface:

$server = new Server(new MiddlewareStack([
    new Buffer(),
    new Callback(function (ServerRequestInterface $request) {
        return new Response(200);
    }),
]));

The latter is done automatically by the server when doing the former. But you can create your own middleware stack implementation and use that instead.
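The wrapping and dispatch idea is small enough to sketch language-agnostically (shown in Python for brevity; the PR itself is PHP). Each middleware gets the request plus a next callable, and a plain handler is wrapped as the innermost middleware:

```python
def make_stack(middlewares):
    """Compose middlewares into a single handler(request) callable."""
    def dispatch(i, request):
        if i == len(middlewares):
            raise RuntimeError("no middleware produced a response")
        middleware = middlewares[i]
        # Each middleware may short-circuit or call `next` to continue.
        return middleware(request, lambda req: dispatch(i + 1, req))
    return lambda request: dispatch(0, request)

def callback(handler):
    """Wrap a plain request handler as the innermost middleware."""
    return lambda request, _next: handler(request)
```

A server given a bare callable would just build make_stack([callback(handler)]) internally, which is the “magic wrapping” mentioned above.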

Included middlewares

This PR includes three middlewares: Buffer, Callback, and LimitHandlers.


Buffer

Buffers the request body until it is fully received or until it reaches the given size:

$middlewares = [
    new Buffer(),
];


Callback

Handles requests just like 0.7, using a callback:

$middlewares = [
    new Callback(function (ServerRequestInterface $request) {
        return new Response(200);
    }),
];


LimitHandlers

Limits the number of concurrent requests being handled at the same time:

$middlewares = [
    new LimitHandlers(10),
];

Body parsing

Currently one of our main issues with body parsing is memory usage. But with LimitHandlers we can pause the incoming request stream until we’re ready to handle it, thus limiting the amount of memory needed to handle requests. And with Buffer we can first stream the body in, parse it with, for example, BodyParser, and then hand it to the next middleware on the stack.


While the middleware implementation in this PR is inspired by the work done for the coming PSR-15, it doesn’t implement it due to a small but major difference: this implementation relies on promises, whereas PSR-15 always assumes a response is returned. An adapter for PSR-15 is beyond the scope of this PR, but not for a Friends of ReactPHP package, and thus I’ve created for/http-middleware-psr15-adapter, which utilizes RecoilPHP and on-the-fly PHP parsing and rewriting of PSR-15 middlewares.

With that package you can wrap the adapter around a PSR-15 middleware as follows:

$middlewares = [
    new PSR15Middleware(
        [/** Constructor arguments */]
    ),
];

Note: I’ve added this last section to show how easy it can be to add PSR-15 middlewares; with the rewriting, there is no guarantee of success.

Updated 19/08/2017 21:27 6 Comments

memory leak on py34-

import gc

import ffilupa

lua = ffilupa.LuaRuntime()
del lua
gc.collect()

On Python 3.5+, after garbage collection, no LuaRuntime object remains. On Python 3.4 and earlier, after garbage collection, a LuaRuntime object remains, referred to by <cdata 'void *' handle to <ffilupa.LuaRuntime object>>.

Updated 19/08/2017 06:20

Need more community resources


I’m looking for help gathering some information and writing various guides/community resources. It would be nice if we could get some documents on things like resume templates, job boards/sites, personal project tips, portfolio building, etc…

Updated 19/08/2017 05:02

Overall Token Structure and MySQL DB structure evaluation


In the current server setup, when a user successfully logs into the app or signs up, the server generates a token using JWT. Currently, there is no expiration time for this token.

Current workflow: the user logs in or signs up, and the server generates a token using JWT and stores it in the LoginSession table along with the userId.

The LoginSession table keeps track of login sessions. We need to check the efficiency and reliability of this setup.

We also need to discuss whether the current DB structure is workable and reasonably efficient.
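If an expiration time is added, the server only needs to stamp the standard JWT exp claim when generating the token and reject stale ones on each request. A minimal check over an already-decoded payload (the surrounding JWT encode/decode code is assumed, not shown):

```python
import time

def token_expired(payload, now=None):
    """True if the decoded JWT payload's 'exp' claim is in the past.

    A payload with no 'exp' at all is treated as expired (fail closed).
    """
    now = time.time() if now is None else now
    return payload.get("exp", 0) <= now
```

Expired rows in the LoginSession table could then be purged on the same check, keeping the table from growing without bound.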

Updated 19/08/2017 16:59

Trainer Professions New setup

  • [ ] Trainer Professions

The Core and Database trainer profession setup is changing how data is read from the trainer into the database. The trainer table is now separated, with a trainer greeting table and a locales table as well, carrying SpellId, RequiredSkill, Cost, and RequiredLevel.

But right now it’s empty, and I need the community’s help to use the WoW parser and sniffer offered by Trinity to sniff out all the trainers in the world. For now, I have manually created a system that works to gather the information needed.

I will update as I go.

Updated 19/08/2017 03:11

Implement IntoIterator for Events


The standard library typically implements IntoIterator<Item = T> for Collection<T> and IntoIterator<Item = &T> for &Collection<T>. In mio, Events has the second but not the first.

The owned iterator can be more convenient in some situations, like putting into a struct.

struct Bad {
    events: mio::Events,
    iter: mio::event::Iter<'??>, // no way to name this lifetime
}

struct Good {
    iter: mio::event::IntoIter,
}
Updated 19/08/2017 05:25

Implement Index<usize> for Events


The mio::Events type looks a lot like Vec: it has a get(usize) -> Option<Event> function which returns None if the index is out of bounds, and it has IntoIterator. It would be convenient to implement Index to enable square-bracket indexing that panics when out of bounds.

println!("{:?}", events[0]);
Updated 19/08/2017 05:25

New Contributors/Maintainers


From @karbassi on August 11, 2017 20:57

I’d like to start off this community fork on a strong foot to allow for open dialog and easy management.


  • [ ] There should be a Code of Conduct created for all to follow. (#217)
  • [ ] We should follow some sort of git branch model.
  • [ ] There should be only 1 or 2 people who have merge permissions into stable branches. (see below)
  • [ ] All changes should go through a pull request, even from project maintainers. (see below)
  • [ ] All pull requests must have tests, which pass via a CI tool (travis-ci has already been set up).

Branch Model

My thoughts are to have master as the latest released version and develop as the latest staged version. This would allow the team to have a “beta” on develop if need be.

Merge permissions

It would be great to have one or two individuals who are the gatekeepers with merge permissions. I’d like to have everyone commit and be part of the project, but only a few in charge of making sure the PRs are valid, tests have been run and passed, and the PR is for the best of the project.

Please add your thoughts and comments.

Copied from original issue: todotxt/todo.txt-cli-fork#9

Updated 20/08/2017 02:30 2 Comments



The README is lacking a few things…

  • [ ] Convert to Markdown.
  • [ ] Add Logo.
  • [ ] Add image/gif of it running in terminal.
  • [ ] Update travis-ci badge.
  • [ ] Add installation instructions for OS X/MacOS, Windows, and Linux
  • [ ] Add Code of Conduct link. (via #216)
  • [x] Add Contributing link. (via #210)
  • [ ] Add Changelog link.
Updated 20/08/2017 02:33

Deduplicate law tests


For some data types and laws, we have to run the same tests multiple times. For example, Id is both a Monad and a Comonad, and each of those has laws that require testing the Functor laws. This effectively means the same batch of tests gets run twice.

Deduplication should be simple if the tests are not immediately run, and are instead aggregated in a set where their name string is their identity.
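The aggregation idea can be sketched with a name-keyed registry (Python here, purely to illustrate; the real change would live in the project’s own test framework). Registering the same law test twice keeps only one copy, and everything runs once at the end:

```python
def run_deduplicated(tests):
    """Run (name, thunk) test pairs, collapsing duplicate names.

    Law tests reached twice (e.g. Functor laws via both Monad and
    Comonad) share a name string, which serves as their identity.
    """
    unique = {}
    for name, thunk in tests:
        unique.setdefault(name, thunk)  # first registration wins
    return {name: thunk() for name, thunk in unique.items()}
```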

Updated 19/08/2017 00:50

Update yargs


The newest versions stopped supporting Node < 4, but yargs is still the best argument parser, so we should stick with it; we should, however, update to the last version they released with Node 0.10 support.

Updated 19/08/2017 00:22

Replace chalk


I’m thinking we should replace chalk with individual ansi-* modules from @jonschlinkert:

Updated 19/08/2017 00:21

jQuery... why?



First thank you for this nice plugin 👍 !

But adding jQuery to my projects only for this plugin makes me really sad… So I was wondering: is it possible to make it dependency-free?

jQuery is not necessary to do the job, and with modern JS syntax I’m sure it would be an easy one.

Don’t you think?

Updated 19/08/2017 05:02 7 Comments

Error and Hup triggered even when not registered


On linux, if I register a socket without error or hup interests, e.g.:

poll.register(&socket, Token(0), Ready::readable(), PollOpt::level());

Then I still receive events with error and hup set.

I sometimes receive Error and Hup set in addition to Readable:

Event { kind: Ready {Readable | Error | Hup}, token: Token(42) }

But sometimes, I also receive Error alone:

Event { kind: Ready {Error}, token: Token(29) }

This is annoying especially in level mode, because if we handle the events like this:

let ready = event.readiness();
if ready.is_readable() {
    // …
}
if ready.is_writable() {
    // …
}
// Error or Hup not handled because not registered, but still triggered

Then the event is not handled, and is triggered again and again in a live loop.

Is it expected?

Updated 19/08/2017 05:25

openpose.bin does not seem to take advantage of multiple GPUs


Issue summary

Executing ./build/examples/openpose/openpose.bin does not seem to take advantage of multiple GPUs, even though multiple GPUs are detected, as indicated by the output message: “Auto-detecting GPUs… Detected 2 GPU(s), using them all.”

Regardless of whether one GPU or two GPUs are used, the processing time for a single image in the image_dir is the same: approximately 4.1-4.2 seconds.

I noted in the following readme that multiple GPUs are for training only. Is that still the case?

Currently Multi-GPU is only supported via the C/C++ paths and only for training.

Note: I’m using a Google Compute instance with NVIDIA Tesla K80 GPU’s.

Executed command (if any)

./build/examples/openpose/openpose.bin --image_dir images --write_keypoint /var/www/html/images --write_keypoint_format xml --keypoint_scale 3 --no_display --render_pose 0

I’ve also tried specifying --num_gpu 2 versus --num_gpu 1.

(I’m only retrieving the keypoint data and not generating an output image.)

OpenPose output (if any)

Starting pose estimation demo.
Auto-detecting GPUs… Detected 2 GPU(s), using them all.
Starting thread(s)
Real-time pose estimation demo successfully finished. Total time: 4.150646 seconds.

Type of issue

  • Help wanted

Your system configuration

Operating system (lsb_release -a in Ubuntu):
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 16.04.2 LTS
Release: 16.04
Codename: xenial

CUDA version (cat /usr/local/cuda/version.txt in most cases): CUDA Version 8.0.61

cuDNN version: (cat /usr/local/cuda/include/cudnn.h | grep CUDNN_MAJOR -A 2)

#define CUDNN_MAJOR 5
#define CUDNN_MINOR 1


GPU model (nvidia-smi in Ubuntu):

NVIDIA-SMI 375.66    Driver Version: 375.66

GPU  Name       Bus-Id        Temp  Perf  Pwr:Usage/Cap  Memory-Usage      GPU-Util  Compute M.
0    Tesla K80  0000:00:04.0  29C   P8    28W / 149W     15MiB / 11439MiB  0%        Default
1    Tesla K80  0000:00:05.0  29C   P8    28W / 149W     0MiB / 11439MiB   0%        Default

Processes:
GPU 0, PID 1976, Type G, /usr/lib/xorg/Xorg, 15MiB

Caffe version: Default from OpenPose or custom version. Default from OpenPose.

OpenCV version:, installed with apt-get install libopencv-dev (Ubuntu)

Generation mode (only for Ubuntu): Makefile + Makefile.config (default, Ubuntu)

Compiler (gcc --version in Ubuntu): gcc (Ubuntu 5.4.0-6ubuntu1~16.04.4) 5.4.0 20160609

Updated 20/08/2017 07:11 3 Comments

Unable to parse compound selector where ID follows class


If I try to parse the selector:


I get the following error:

Error: invalid syntax at line 1 col 8:
Unexpected "#"

    at Parser.feed (/{PATH}/node_modules/nearley/lib/nearley.js:320:23)
    at Object.parse (/{PATH}/node_modules/scalpel/dist/createParser.js:24:26)
    at repl:1:3
    at ContextifyScript.Script.runInThisContext (vm.js:44:33)
    at REPLServer.defaultEval (repl.js:239:29)
    at bound (domain.js:301:14)
    at REPLServer.runBound [as eval] (domain.js:314:12)
    at REPLServer.onLine (repl.js:433:10)
    at emitOne (events.js:120:20)
    at REPLServer.emit (events.js:210:7)

It works fine if I reverse the ordering of the selectors. Both orderings are valid and should be parsed successfully.

Updated 20/08/2017 02:23 4 Comments

rig project sync does not set working directory


All shell executions by a rig project * command should set the working directory to the directory of the config file.

In this way a developer could export the location of a config file in a terminal session, then run commands relevant to the project regardless of their actual working directory.
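The intended behaviour can be sketched in shell terms. RIG_PROJECT_CONFIG is an invented variable name here, standing in for however rig actually receives the exported config location:

```shell
# Derive the project directory from the config file's path and report it;
# a `rig project` command would chdir here before running anything else.
config="${RIG_PROJECT_CONFIG:-./rig.yml}"
project_dir="$(cd "$(dirname "$config")" && pwd)"
echo "$project_dir"
```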

An example of this can be seen around

Updated 18/08/2017 21:38

Invalid react/no-direct-mutation-state in constructor


In the following special case, the react/no-direct-mutation-state rule throws an invalid error:

class OneComponent extends Component {
  constructor() {
    class AnotherComponent extends Component {
      constructor() {
        // …
      }
    }

    this.state = {}; // flagged, although this is OneComponent's constructor
  }
}

A better way is to not declare a component inside another component’s constructor, but this is still an invalid linter error.

Updated 19/08/2017 22:43 3 Comments

Should error / warn when using import or require


These days there are lots of ways to pull in a lodash function. I would expect each of these to generate errors:

import forEach from 'lodash/forEach'
import whatever from 'lodash/forEach'

import { forEach } from 'lodash'
import { forEach as whatever } from 'lodash'

import forEach from 'lodash.foreach'
import whatever from 'lodash.foreach'

const forEach = require('lodash/forEach')
const whatever = require('lodash/forEach')

const { forEach } = require('lodash')
const { forEach: whatever } = require('lodash')

const forEach = require('lodash.foreach')
const whatever = require('lodash.foreach')

I was surprised that this wasn’t generating errors in my project. Did I miss something?

Updated 19/08/2017 23:01 1 Comments

Is v0.15.0 compatible with PHPUnit 5.7.21?


I’m using PHPUnit 5.7.21 with PHP 5.6.31 and haven’t been able to run tests. When executing, I get the following messages:

PHP Warning: require(/var/www/docs/myapp/application/tests/third_party/PHP-Parser-3.0.3/lib/bootstrap.php): failed to open stream: No such file or directory in /var/www/docs/myapp/application/tests/Bootstrap.php on line 25
PHP Fatal error: require(): Failed opening required '/var/www/docs/myapp/application/tests/third_party/PHP-Parser-3.0.3/lib/bootstrap.php' (include_path='.:/usr/share/pear:/usr/share/php') in /var/www/docs/myapp/application/tests/Bootstrap.php on line 25

I don’t have Composer and cannot install it since I am not the server admin, so I followed the instructions to set up ci-phpunit-test manually.

Am I missing anything?


Updated 19/08/2017 05:09 5 Comments

Error Using interact_user_followers on RPI3


I’m receiving an error when attempting to use the interact_user_followers method to like only 3 posts from a specified user’s followers. Running on a Raspberry Pi 3.

dont_include, self.username, self.follow_restrict, random)
  File "/home/pi/Projects/InstaPy/instapy/", line 286, in get_given_user_followers
    dialog = browser.find_element_by_xpath('/html/body/div[4]/div/div[2]/div/div[2]/div/div[2]')
  File "/usr/local/lib/python2.7/dist-packages/selenium/webdriver/remote/", line 293, in find_element_by_xpath
    return self.find_element(by=By.XPATH, value=xpath)
  File "/usr/local/lib/python2.7/dist-packages/selenium/webdriver/remote/", line 752, in find_element
    'value': value})['value']
  File "/usr/local/lib/python2.7/dist-packages/selenium/webdriver/remote/", line 236, in execute
  File "/usr/local/lib/python2.7/dist-packages/selenium/webdriver/remote/", line 192, in check_response
    raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.NoSuchElementException: Message: Unable to locate element: {"method":"xpath","selector":"/html/body/div[4]/div/div[2]/div/div[2]/div/div[2]"}
    at FirefoxDriver.prototype.findElementInternal_ (file:///tmp/tmpiWGE8x/extensions/
    at fxdriver.Timer.prototype.setTimeout/<.notify (file:///tmp/tmpiWGE8x/extensions/

My file is below:

from instapy import InstaPy

insta_username = 'username'
insta_password = 'password'

# if you want to run this script on a server, 
# simply add nogui=True to the InstaPy() constructor

InstaPy(username=insta_username, password=insta_password, use_firefox=True, page_delay=25)\
  .set_dont_include(['jonnykmedia', 'yogabuttar', 'mtnlovejh'])\
  .set_dont_like(['sex', 'nsfw'])\
  .set_user_interact(amount=3, random=False, percentage=100)\
  .set_do_follow(enabled=False, percentage=70)\
  .set_do_like(enabled=True, percentage=100)\
  .set_comments(["Cool", "Super!"])\
  .set_do_comment(enabled=False, percentage=33)\
  .interact_user_followers(['armadaskis'], amount=300, random=False)
Updated 18/08/2017 21:03

Add signature validation to Xbe loading


Luke proposed the following idea: “Someone should add signature validation to Cxbx, it’s not that hard. We already have all the crypto functions required. That way we can show in log files or xbe dumps if it’s not a legit xbe. All homebrew will be marked as non-legit, but all commercial games will be legit, unless pirated. Also important: most homebrew shares the same title id, usually 0, and most homebrew also has the xbe signature nulled out too. But sometimes it’s signed with a debug key, and sometimes a completely new key is used specifically for homebrew.”

Updated 18/08/2017 21:00
