Contribute to Open Source. Search issue labels to find the right project for you!

choo event name convention

datproject/dat-desktop

conversation started in https://github.com/datproject/dat-desktop/pull/315#pullrequestreview-28411454:

@yoshuawuyts

ohhh, we should probably decide on how to name events; in choo we usually delimit using : and camelCase for words. In the dat API we use whitespace to separate words, and I don't think we have namespaces. I feel like we should agree on something and then stick to it. Perhaps : to namespace and whitespace to split words?

@yoshuawuyts

This is not a blocker btw, perhaps something to discuss either over text or face to face? I feel we can figure it out in 5 minutes when talking haha

@karissa

+1 on having a convention

Updated 27/03/2017 08:03

MathematicalProgram does not error if constraints are added with Bindings to non-decision variables

RobotLocomotion/drake

I was hoping that if I called AddCost(Expression), where my Expression contains variables that had nothing to do with the MathematicalProgram, the AddCost would throw an error. It does not.

Is this a reasonable error to catch? I know we have a lot of entry points for costs/constraints. Where is the right place to do this check?
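
For reference, a minimal sketch of the scenario using the Python bindings (pydrake; module paths and behavior vary by Drake version, and the names below are only illustrative):

```python
# Sketch of the report above: the cost references a variable the program never declared.
from pydrake.all import MathematicalProgram, Variable

prog = MathematicalProgram()
x = prog.NewContinuousVariables(1, "x")   # a genuine decision variable of prog
z = Variable("z")                         # NOT a decision variable of prog

# The issue reports that this is currently accepted; the request is that adding
# a cost (or constraint) over unknown variables like z should throw instead.
prog.AddCost(x[0] ** 2 + z)
```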

Updated 26/03/2017 18:43 1 Comment

Server does not normalize card position numbers

ClaytonPassmore/ProjectOrange

The server doesn't currently normalize card position numbers, meaning the client can receive decks which have duplicated position numbers or gaps in position numbers. <img width="378" alt="screen shot 2017-03-25 at 7 32 53 pm" src="https://cloud.githubusercontent.com/assets/13400887/24326975/dede9116-1191-11e7-9880-577e14e3a331.png">

This causes trouble when trying to rearrange cards within a deck. On the client, since we use the index into the array as the authoritative position source for moving cards locally and re-normalize position after shuffling cards around, move actions are processed correctly. <img width="523" alt="screen shot 2017-03-25 at 7 35 03 pm" src="https://cloud.githubusercontent.com/assets/13400887/24326990/3462c2ec-1192-11e7-825f-37f972b8c8cc.png">

After sync, the deck returned from the server has gaps in the positions and the order of the cards is incorrect: <img width="1058" alt="screen shot 2017-03-25 at 7 36 27 pm" src="https://cloud.githubusercontent.com/assets/13400887/24326996/75af14b2-1192-11e7-862f-8b2441728767.png">
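
As a rough sketch of the kind of server-side normalization being asked for (illustrative Python; the real server's data model and field names may differ):

```python
# Illustrative only: rewrite card positions as a dense 0..n-1 sequence,
# resolving duplicates and gaps by sorting on the stored position.
def normalize_positions(cards):
    for new_position, card in enumerate(sorted(cards, key=lambda c: c["position"])):
        card["position"] = new_position
    return cards

deck = [{"id": "a", "position": 0}, {"id": "b", "position": 2}, {"id": "c", "position": 2}]
normalize_positions(deck)
assert [c["position"] for c in deck] == [0, 1, 2]
```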

Updated 26/03/2017 21:27 1 Comment

Pagination in author's page

apekshaha/zkt
  1. Go to http://zaykakatadka.com/author/vijay-haldiya/
  2. All the recipes posted by this author come in a single page.

We need to build pagination on the author's page such that we show 9 recipes at a time (9 recipes because on a laptop that means 3 rows of recipes, which is what fits on a single page without scrolling).
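
The paging arithmetic itself is simple; a minimal sketch (Python, purely illustrative of the slicing, not the site's actual stack):

```python
# Illustrative only: 9 recipes per page (3 rows of 3 on a laptop screen).
PER_PAGE = 9

def page_of(recipes, page):
    """Return the recipes for a 1-indexed page number."""
    start = (page - 1) * PER_PAGE
    return recipes[start:start + PER_PAGE]

recipes = list(range(25))                            # stand-in for the author's recipes
assert page_of(recipes, 1) == list(range(0, 9))
assert page_of(recipes, 3) == list(range(18, 25))    # the last page may be partial
```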

Updated 25/03/2017 14:16

Allow <note> within <notesStmt>

music-encoding/music-encoding

It has always bothered me that <note> doesn’t occur within <notesStmt> as it logically should. Unfortunately, there’s this other thing that we use <note> for. 😀

One solution would be to put the kind of <note> that occurs within <notesStmt> in a different namespace. However, doing this for just one element seems a long way to go. Another possibility, better in my opinion, is to use something like the following ODD fragment, which creates an element called "genNote" (generic note), but then immediately renames it "note".

```xml
<elementSpec ident="genNote" module="MEI.shared" mode="add">
  <!-- DO NOT REMOVE <altIdent>! It is used to rename <genNote> so that its name better
       matches its only possible parent, i.e., <notesStmt>. -->
  <altIdent>note</altIdent>
  <desc>(generic note) – Provides general information for which a specialized element has
    not been defined.</desc>
  <classes>
    <memberOf key="att.audience"/>
    <memberOf key="att.bibl"/>
    <memberOf key="att.common"/>
    <memberOf key="att.dataPointing"/>
    <memberOf key="att.lang"/>
    <memberOf key="att.source"/>
    <memberOf key="att.responsibility"/>
    <memberOf key="att.targetEval"/>
  </classes>
  <content>
    <rng:zeroOrMore>
      <rng:choice>
        <rng:text/>
        <rng:ref name="model.headLike"/>
        <rng:ref name="model.textComponentLike"/>
        <rng:ref name="model.textPhraseLike"/>
        <rng:ref name="model.editLike"/>
        <rng:ref name="model.transcriptionLike"/>
      </rng:choice>
    </rng:zeroOrMore>
  </content>
  <constraintSpec ident="genNote_content_constraint" scheme="isoschematron">
    <constraint>
      <sch:rule context="mei:note[mei:head or mei:lg or mei:p or mei:quote or mei:table]">
        <sch:assert test="not(mei:head[preceding-sibling::*[not(local-name()='head')]])">Head elements can only occur at the start of a note.</sch:assert>
        <sch:assert test="not(*[../text()[normalize-space()]])">Mixed content is not allowed when head, lg, p, quote or table elements are used.</sch:assert>
        <sch:assert test="not(*[not(local-name() eq 'biblList' or local-name() eq 'castList' or local-name() eq 'head' or local-name() eq 'lg' or local-name() eq 'list' or local-name() eq 'p' or local-name() eq 'quote' or local-name() eq 'relationList' or local-name() eq 'table')])">Unstructured text not allowed when head, lg, p, quote or table elements are used.</sch:assert>
      </sch:rule>
    </constraint>
  </constraintSpec>
  <remarks>
    <p>This element is intended to be used for general notes that occur within
      <gi scheme="MEI">notesStmt</gi>. The <att>resp</att> attribute records the editor(s)
      responsible for identifying or creating the note.</p>
  </remarks>
</elementSpec>
```

Essentially, this creates another <note> element, one with a different content model. Whereas the musical note looks like this in the RNG –

```xml
<define name="mei_note">
  <element name="note">
```

– the ODD above generates –

```xml
<define name="mei_genNote">
  <element name="note">
```

– making it possible to use mei_genNote in <notesStmt> and mei_note in <layer>. Same name, but completely different elements as far as the RNG is concerned.

At present, the only downside I see is that oXygen displays the same tooltip for both contexts; i.e., a concatenation of the descriptions of both notes.

Updated 27/03/2017 09:26 2 Comments

Clarify use of <annot>

music-encoding/music-encoding

The <annot> element tries to be too many things, leading to confusion about its purpose and how to render it. In an attempt to fix this situation, I propose:

  • <annot> be limited to explanatory material (text and non-text) that occurs (or has the potential to occur) within the musical notation. Thus, <annot> may occur wherever other so-called "control events" are allowed and can be rendered similar to <dir>.
  • <remark> be added to accommodate explanatory material that occurs outside the musical notation; i.e., marginalia, footnotes, etc. It may occur at the text phrase level as well as in ending, layer, lem, measure, part, perfMedium, pgDesc, rdg, score, section, sp, staff, syllable, and symbolDef.
  • An @audience attribute be added to <annot> and <remark> with values of "public" and "private" to distinguish those features that are intended for internal use from those that are for all uses. Elements marked as "private" should not be rendered.
  • The main differences between <annot> and <remark> are a simpler content model for <annot> compared to <remark> and the values that may be used in their @place attribute. Also, <annot>, like <dir>, will be linkable to other items in the notation via its other attributes (e.g., @plist, @staff, @tstamp, etc.), while <remark> will use @target to create an association with another element.
  • Both digitized and born-digital material may use these elements. Depending on the purpose of the encoding, it may be wise to wrap them in other elements (<add>, <del>, <supplied>, etc.) to fully capture their editorial nature.

Updated 27/03/2017 09:33 4 Comments

NullPointerException

Rsl1122/Plan-PlayerAnalytics

Plan Version: 2.9.0
Server Version: Spigot 1.11
Database Type: MySQL

Description: My console shows: "ERROR Some data might be corrupted: c5ac5699-0510-484f-9945-71746f292fbf" and "ERROR Caught java.lang.NullPointerException. It has been logged to the Errors.txt"

Steps to Reproduce:

  1. Nothing. It just happens.

Stack Trace - Console/Errors.txt contents: Errors.txt: [Mar 24 17:23:11] main.java.com.djrapitops.plan.data.cache.GetConsumer Caught java.lang.NullPointerException [Mar 24 17:23:11] main.java.com.djrapitops.plan.data.handlers.GamemodeTimesHandler.handleChangeEvent(GamemodeTimesHandler.java:60) [Mar 24 17:23:11] main.java.com.djrapitops.plan.data.handlers.GamemodeTimesHandler.handleLogin(GamemodeTimesHandler.java:36) [Mar 24 17:23:11] main.java.com.djrapitops.plan.data.listeners.PlanPlayerListener$1$1.process(PlanPlayerListener.java:79) [Mar 24 17:23:11] main.java.com.djrapitops.plan.database.databases.SQLDB.giveUserDataToProcessors(SQLDB.java:629) [Mar 24 17:23:11] main.java.com.djrapitops.plan.data.cache.GetConsumer.consume(DataCacheGetQueue.java:88) [Mar 24 17:23:11] main.java.com.djrapitops.plan.data.cache.GetConsumer.run(DataCacheGetQueue.java:73) [Mar 24 17:23:11] java.lang.Thread.run(Thread.java:745) [Mar 24 17:23:11] [Mar 24 17:23:37] main.java.com.djrapitops.plan.utilities.Analysis Caught java.lang.NullPointerException [Mar 24 17:23:37] main.java.com.djrapitops.plan.utilities.Analysis.lambda$analyzeData$5(Analysis.java:150) [Mar 24 17:23:37] java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184) [Mar 24 17:23:37] java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374) [Mar 24 17:23:37] java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) [Mar 24 17:23:37] java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:291) [Mar 24 17:23:37] java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731) [Mar 24 17:23:37] java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) [Mar 24 17:23:37] java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056) [Mar 24 17:23:37] java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692) [Mar 24 17:23:37] java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157) [Mar 24 17:23:37] [Mar 24 17:23:55] main.java.com.djrapitops.plan.ui.webserver.Response Caught java.lang.NullPointerException [Mar 24 17:23:55] main.java.com.djrapitops.plan.utilities.PlaceholderUtils.getInspectReplaceRules(PlaceholderUtils.java:148) [Mar 24 17:23:55] main.java.com.djrapitops.plan.ui.DataRequestHandler.getInspectHtml(DataRequestHandler.java:58) [Mar 24 17:23:55] main.java.com.djrapitops.plan.ui.webserver.Response.sendStaticResource(Response.java:83) [Mar 24 17:23:55] main.java.com.djrapitops.plan.ui.webserver.WebSocketServer$1.run(WebSocketServer.java:77) [Mar 24 17:23:55] org.bukkit.craftbukkit.v1_11_R1.scheduler.CraftTask.run(CraftTask.java:71) [Mar 24 17:23:55] org.bukkit.craftbukkit.v1_11_R1.scheduler.CraftAsyncTask.run(CraftAsyncTask.java:52) [Mar 24 17:23:55] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [Mar 24 17:23:55] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [Mar 24 17:23:55] java.lang.Thread.run(Thread.java:745) [Mar 24 17:23:55] [Mar 24 17:23:59] main.java.com.djrapitops.plan.ui.webserver.Response Caught java.lang.NullPointerException [Mar 24 17:23:59] main.java.com.djrapitops.plan.utilities.PlaceholderUtils.getInspectReplaceRules(PlaceholderUtils.java:148) [Mar 24 17:23:59] main.java.com.djrapitops.plan.ui.DataRequestHandler.getInspectHtml(DataRequestHandler.java:58) [Mar 24 17:23:59] main.java.com.djrapitops.plan.ui.webserver.Response.sendStaticResource(Response.java:83) [Mar 24 17:23:59] 
main.java.com.djrapitops.plan.ui.webserver.WebSocketServer$1.run(WebSocketServer.java:77) [Mar 24 17:23:59] org.bukkit.craftbukkit.v1_11_R1.scheduler.CraftTask.run(CraftTask.java:71) [Mar 24 17:23:59] org.bukkit.craftbukkit.v1_11_R1.scheduler.CraftAsyncTask.run(CraftAsyncTask.java:52) [Mar 24 17:23:59] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [Mar 24 17:23:59] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [Mar 24 17:23:59] java.lang.Thread.run(Thread.java:745) [Mar 24 17:23:59] CONSOLE: 24.03 17:25:07 [Server] WARN Plugin Plan v2.9.0 generated an exception while executing task 180 24.03 17:25:07 [Server] [Informatie] java.lang.NullPointerException 24.03 17:25:07 [Server] [Informatie] at main.java.com.djrapitops.plan.data.handlers.GamemodeTimesHandler.saveToCache(GamemodeTimesHandler.java:79) ~[?:?] 24.03 17:25:07 [Server] [Informatie] at main.java.com.djrapitops.plan.data.cache.DataCacheHandler$3.process(DataCacheHandler.java:239) ~[?:?] 24.03 17:25:07 [Server] [Informatie] at main.java.com.djrapitops.plan.data.cache.DataCacheHandler.getUserDataForProcessing(DataCacheHandler.java:146) ~[?:?] 24.03 17:25:07 [Server] [Informatie] at main.java.com.djrapitops.plan.data.cache.DataCacheHandler.getUserDataForProcessing(DataCacheHandler.java:159) ~[?:?] 24.03 17:25:07 [Server] [Informatie] at main.java.com.djrapitops.plan.data.cache.DataCacheHandler.saveHandlerDataToCache(DataCacheHandler.java:242) ~[?:?] 24.03 17:25:07 [Server] [Informatie] at main.java.com.djrapitops.plan.data.cache.DataCacheHandler.lambda$saveHandlerDataToCache$1(DataCacheHandler.java:229) ~[?:?] 24.03 17:25:07 [Server] [Informatie] at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:291) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.concurrent.ForkJoinTask.doInvoke(ForkJoinTask.java:401) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:734) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.stream.ForEachOps$ForEachOp.evaluateParallel(ForEachOps.java:160) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateParallel(ForEachOps.java:174) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:583) ~[?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at 
main.java.com.djrapitops.plan.data.cache.DataCacheHandler.saveHandlerDataToCache(DataCacheHandler.java:228) ~[?:?] 24.03 17:25:07 [Server] [Informatie] at main.java.com.djrapitops.plan.data.cache.DataCacheHandler$1.run(DataCacheHandler.java:106) ~[?:?] 24.03 17:25:07 [Server] [Informatie] at org.bukkit.craftbukkit.v1_11_R1.scheduler.CraftTask.run(CraftTask.java:71) ~[spigot-1.11.jar:git-Spigot-f950f8e-0fa1ad2] 24.03 17:25:07 [Server] [Informatie] at org.bukkit.craftbukkit.v1_11_R1.scheduler.CraftAsyncTask.run(CraftAsyncTask.java:52) [spigot-1.11.jar:git-Spigot-f950f8e-0fa1ad2] 24.03 17:25:07 [Server] [Informatie] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_101] 24.03 17:25:07 [Server] [Informatie] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_101]

Updated 25/03/2017 19:25 1 Comment

TEACHERS WITH 2 EMAIL ADDRESSES DO NOT APPEAR AS REGISTERED ON THE WEB

BinPar/PRM

Alex and Noemí report the following error: When we pull a listing from the promotions history, it offers the option Alta WEB YES/NO (web registration). Sometimes the teacher is registered, but when exporting the EXCEL they appear as NOT registered. This happens with teachers who have several email addresses assigned in their profile: even if we mark the email with which they are registered, when pulling the promotions history it sometimes picks another (unregistered) address, and that is the problem. We have contacted teachers who are supposedly not registered, and they are in fact registered. (Screenshot: "prm contactos con varios mails aparecen como no resgistrados en la web".) One of the metrics used to measure promoter activity is the number of teachers who appear as registered on the WEB. If, despite being registered, they appear as unregistered because they have two email addresses, the data is not accurate. @CristianBinpar

Updated 24/03/2017 13:14

project-walkthrough: save_cache paths for Python lead to "Command not found" in subsequent builds

circleci/circleci-docs

In the project-walkthrough, it is suggested to cache the following directories for Python projects:

  • ~/.cache/pip
  • /usr/local/lib/python3.6/site-packages

This leads to errors like "flake8: Command not found" in subsequent builds: with the cache in place, pip thinks flake8 is already installed, but the bin directory is not restored.

To fix this, one can create a virtualenv and cache it, just like in CircleCI 1.0:

```yml
- restore_cache:
    key: v1-{{ .Branch }}-{{ checksum "requirements/dev.txt" }}
- run: |
    python3 -m venv ~/venv
    . ~/venv/bin/activate
    pip install -r requirements/dev.txt
- run: |
    . ~/venv/bin/activate
    make test
- save_cache:
    key: v1-{{ .Branch }}-{{ checksum "requirements/dev.txt" }}
    paths:
      - "~/.pip/cache"
      - "~/venv"
```

Updated 24/03/2017 17:25 1 Comment

Test all visualizers against API promises systematically

DistrictDataLabs/yellowbrick

Potential testing idea - not sure how much of this is worth the time. These can be built into each visualizer's unit tests, but it would be good to have a basic set of checks that can be run for all visualizers.

To make sure that each new visualizer fulfills the few important API promises discussed yesterday, we should write unit tests that check that every visualizer meets them.

  • [ ] Either make a list of all visualizers or (preferred) automatically find all children of the Visualizer class and run the tests on each one.

Checks (a rough sketch of how these might be driven follows this list):

  • [ ] fit returns self
  • [ ] transform returns X
  • [ ] check that it can accept both a numpy array and a pandas DataFrame
  • [ ] check that there is a docstring under the class name, and not one under __init__
  • [ ] use the inspect module to find all param names, and check that the docstring contains all of those words somewhere (to avoid undocumented params)
  • [ ] see if there is any other quick and easy formatting check we can do on the docstring; not sure if there's anything else worth adding
  • [ ] use the inspect module to search for anything that looks like a hard-coded hex value in the draw method
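
To make this concrete, a minimal sketch of how such a suite might be driven (illustrative only: the subclass discovery below only finds classes that have already been imported, and a real test would need per-visualizer fixture data and constructor arguments):

```python
# Illustrative sketch: walk Visualizer subclasses and check a couple of the
# API promises listed above (fit returns self, transform returns X, docstring exists).
import numpy as np
from yellowbrick.base import Visualizer


def all_visualizer_classes(base=Visualizer):
    """Recursively collect subclasses of Visualizer that are currently imported."""
    for cls in base.__subclasses__():
        yield cls
        yield from all_visualizer_classes(cls)


def check_api_promises(viz, X, y):
    assert viz.fit(X, y) is viz, "fit must return self"
    if hasattr(viz, "transform"):
        assert np.array_equal(viz.transform(X), X), "transform must return X"
    assert viz.__class__.__doc__, "the docstring belongs on the class, not __init__"


# Example usage with one concrete visualizer (constructor arguments vary per class):
# from yellowbrick.features import Rank2D
# check_api_promises(Rank2D(), X, y)
```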

Updated 25/03/2017 17:32 3 Comments

java.lang.ArrayIndexOutOfBoundsException on /plan analyze

Rsl1122/Plan-PlayerAnalytics

Plan Version: 2.9.0
Server Version: git-Paper-1064
Database Type: sqlite
Command Causing Issue: /plan analyze

Description: After running the analyze command in-game, I see “Analysis Command timed out!” and the error below in console.

Steps to Reproduce:

  1. Run /plan analyze in-game.

Stack Trace - Console/Errors.txt contents: java.lang.ArrayIndexOutOfBoundsException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_45] at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_45] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_45] at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[?:1.8.0_45] at java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:598) ~[?:1.8.0_45] at java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:677) ~[?:1.8.0_45] at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:735) ~[?:1.8.0_45] at java.util.stream.ForEachOps$ForEachOp.evaluateParallel(ForEachOps.java:160) ~[?:1.8.0_45] at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateParallel(ForEachOps.java:174) ~[?:1.8.0_45] at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233) ~[?:1.8.0_45] at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418) ~[?:1.8.0_45] at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:583) ~[?:1.8.0_45] at main.java.com.djrapitops.plan.utilities.Analysis.analyzeData(Analysis.java:147) ~[?:?] at main.java.com.djrapitops.plan.utilities.Analysis.analyze(Analysis.java:135) ~[?:?] at main.java.com.djrapitops.plan.utilities.Analysis$1.run(Analysis.java:67) ~[?:?] at org.bukkit.craftbukkit.v1_11_R1.scheduler.CraftTask.run(CraftTask.java:58) ~[patched_1.11.2.jar:git-Paper-1064] at org.bukkit.craftbukkit.v1_11_R1.scheduler.CraftAsyncTask.run(CraftAsyncTask.java:52) [patched_1.11.2.jar:git-Paper-1064] at com.destroystokyo.paper.ServerSchedulerReportingWrapper.run(ServerSchedulerReportingWrapper.java:22) [patched_1.11.2.jar:git-Paper-1064] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_45] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_45] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_45] Caused by: java.lang.ArrayIndexOutOfBoundsException at java.util.ArrayList.addAll(ArrayList.java:580) ~[?:1.8.0_45] at main.java.com.djrapitops.plan.utilities.Analysis.lambda$analyzeData$5(Analysis.java:191) ~[?:?] at main.java.com.djrapitops.plan.utilities.Analysis$$Lambda$33/852530624.accept(Unknown Source) ~[?:?] at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184) ~[?:1.8.0_45] at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374) ~[?:1.8.0_45] at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:512) ~[?:1.8.0_45] at java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:291) ~[?:1.8.0_45] at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731) ~[?:1.8.0_45] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) ~[?:1.8.0_45] at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056) ~[?:1.8.0_45] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1689) ~[?:1.8.0_45] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157) ~[?:1.8.0_45]

Updated 23/03/2017 15:24 1 Comment

Pricing headings in server/maas don't reduce in size in small screens

canonical-websites/www.ubuntu.com

Summary

Tier and price headings in the Pricing section don't reduce in size on small screens

Process

  • On a large screen viewport, go to https://www.ubuntu.com/server/maas and scroll to bottom to Pricing section
  • The "Pricing" heading is the same size as the tier headings and prices
  • If you reduce the screen size to an approximate mobile phone size, the "Pricing" heading reduces, but the others don't

Current and expected result

All headings should reduce equally on small screens.

Screenshot

<img width="414" alt="maas-pricing" src="https://cloud.githubusercontent.com/assets/223966/24241998/6918673e-0fae-11e7-9d2a-21fc28931f82.png">

Updated 23/03/2017 09:52

Add Guidance to "Contributing" on code documentation

DistrictDataLabs/yellowbrick

Please include a dummy or starter code example with perfectly documented code that a contributor can copy/paste and get started with, and that will work nicely. Also include an example of a real module that meets the perfect code documentation style.

Perhaps add an empty module with the 'perfect' starter example code into examples/

Add this to: http://www.scikit-yb.org/en/latest/about.html#contributing
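
A minimal sketch of what such a starter example might look like, assuming numpydoc-style docstrings (the filename, function, and exact conventions here are only placeholders, not the project's actual guidance):

```python
# Hypothetical starter file, e.g. examples/starter.py; names are illustrative only.
import numpy as np


def mean_center(X):
    """
    Subtract the column means from a feature matrix.

    Parameters
    ----------
    X : array-like of shape (n_samples, n_features)
        The data to center.

    Returns
    -------
    X_centered : ndarray of shape (n_samples, n_features)
        A copy of ``X`` with each column shifted to zero mean.

    Examples
    --------
    >>> mean_center(np.array([[1.0, 2.0], [3.0, 4.0]]))
    array([[-1., -1.],
           [ 1.,  1.]])
    """
    X = np.asarray(X, dtype=float)
    return X - X.mean(axis=0)
```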

Updated 25/03/2017 17:34 3 Comments

SoftDeletes in Lists are not being filtered out

octobercms/october
Expected behavior

When rendering a list where columns have a relation to a model with the SoftDelete trait, I expect the model to adhere to the soft delete, so soft-deleted relations will NOT be visible in the list.

Actual behavior

When defining a relation to a model with the SoftDelete trait, soft-deleted records still show up in the list column.

Even adding a separate scope, besides the scope that is added automatically for a soft delete, doesn't change anything in the list rendering.

Reproduce steps

Model: Sections

```php
<?php

namespace Models;

use Model;
use October\Rain\Database\Traits\Validation;

/**
 * Model
 */
class Sections extends Model
{
    use Validation;

/*
 * Validation
 */
/**
 * @var array
 */
public $rules = [];

/*
 * Disable timestamps by default.
 * Remove this line if timestamps are defined in the database table.
 */
/**
 * @var bool
 */
public $timestamps = false;

/**
 * @var string The database table used by the model.
 */
public $table = 'hornbach_eventplanner_sections';

/**
 *
 */
public function beforeCreate()
{
    $this->available = $this->total;
}

/**
 * @var array
 */
public $hasMany = [
    'clients' => [
        'Models\Clients'
    ],
];

/**
 * @var array The following models have multiple records for one section
 */
public $belongsToMany = [
    'eventdates' => [
        'Models\Eventdates',
        'table' => 'eventdates_sections',
    ],
];

}
```

Model: Clients

```php
<?php

namespace Models;

use Model;
use October\Rain\Database\Traits\Validation;
use October\Rain\Database\Traits\SoftDelete;

/**
 * Model
 */
class Clients extends Model
{
    use Validation, SoftDelete;

/**
 * Date columns for model.
 *
 * @var array $dates
 */
protected $dates = ['deleted_at'];

/*
 * Validation
 */
/**
 * @var array
 */
public $rules = [];

/*
 * Mass assignment
 */

/**
 * @var array
 */
protected $fillable = ['edit_link'];

/*
 * Disable timestamps by default.
 * Remove this line if timestamps are defined in the database table.
 */
/**
 * @var bool
 */
public $timestamps = true;

/**
 * @var string The database table used by the model.
 */
public $table = 'clients';

/**
 * @var array
 */
public $belongsTo = [
    'events' => ['Models\Events'],
    'sections' => ['Models\Sections'],
    'branches' => ['Models\Branches'],
    'eventdates' => ['Models\Eventdates'],
    'section' => ['Models\Sections'],
];

/**
 * @var array
 */
public $hasMany = [
    'clientdata' => ['Models\ClientData'],
    'clientfile' => ['Models\ClientFile'],
    'payments' => ['Models\Payments'],
];

/**
 * @var array
 */
public $hasOne = [
    'branche' => ['Models\Branches'],
];

}
```

Sections -> Columns.yaml:

```yaml
columns:
    name:
        label: 'lang.name'
        type: text
    total:
        label: Total
        type: text
    available:
        label: Available
        type: text
    customername:
        label: Customer
        relation: clients
        select: concat(firstname, ' ', lastname)
    email:
        label: E-mail
        relation: clients
        type: text
        select: email
```

The list is generated correctly, but the soft-deleted relations still show up in it, in this case in the customername and email columns, which are both related to 'clients'.

Also adding a condition in the model ‘sections’ under ‘hasMany’ has no influence on the result.

I’ve narrowed it down to the file: modules/backend/widgets/Lists.php. On line 455 explicits ignores using scopes in the count query, resulting in not using the SoftDelete scope. $countQuery = $relationObj->getRelationCountQuery($relationObj->getRelated()->newQueryWithoutScopes(), $query);

When newQueryWithoutScopes is changed to newQuery, the SoftDelete scope is applied.

Result before:

```sql
SELECT
  `sections`.*,
  (SELECT group_concat(concat(firstname, ' ', lastname) SEPARATOR ', ')
   FROM `clients`
   WHERE `clients`.`sections_id` = `sections`.`id`) AS `klantnaam`,
  (SELECT group_concat(email SEPARATOR ', ')
   FROM `clients`
   WHERE `clients`.`sections_id` = `sections`.`id`) AS `email`,
  `sections`.`eventdates_id` AS `pivot_eventdates_id`,
  `sections`.`sections_id` AS `pivot_sections_id`
FROM `sections`
  INNER JOIN `sections` ON `sections`.`id` = `sections`.`sections_id`
WHERE `sections`.`eventdates_id` = 47
ORDER BY `name` DESC
```

Result after changing the line in Lists.php

```sql
SELECT
  `sections`.*,
  (SELECT group_concat(concat(firstname, ' ', lastname) SEPARATOR ', ')
   FROM `clients`
   WHERE `clients`.`deleted_at` IS NULL AND
         `clients`.`sections_id` = `sections`.`id`) AS `klantnaam`,
  (SELECT group_concat(email SEPARATOR ', ')
   FROM `clients`
   WHERE `clients`.`deleted_at` IS NULL AND
         `clients`.`sections_id` = `sections`.`id`) AS `email`,
  `eventdates_sections`.`eventdates_id`                                   AS `pivot_eventdates_id`,
  `eventdates_sections`.`sections_id`                                     AS `pivot_sections_id`
FROM `sections`
  INNER JOIN `eventdates_sections`
    ON `sections`.`id` = `eventdates_sections`.`sections_id`
WHERE `eventdates_sections`.`eventdates_id` = 47
ORDER BY `name` DESC
```
October build

396

Updated 27/03/2017 07:57 1 Comment

Tutorial: Teach users that labels disappear from the GSV screen when they walk

ProjectSidewalk/SidewalkWebpage

Update the tutorial script to teach users that previously applied labels won't be visible in GSV once they walk away from the point where they applied them; they can, however, still be seen in the top-down map.

Related comment from #531:

Jon: we need to better onboard the fact that labels disappear in the GSV interface after you take a step, so users should focus on the top-down map for previously placed labels. Could also show the user that the labels are still there if they step back to the labeling location.

Related issue: https://github.com/ProjectSidewalk/SidewalkWebpage/issues/502 - pointing to the top down map to track labels.

Updated 22/03/2017 18:05

Tutorial: Improve instruction text to pan at the two curb ramp corner

ProjectSidewalk/SidewalkWebpage

From #531:

Jon: I have observed users getting confused here too. They now see two curb ramps to label but the tutorial interface is telling them to continue panning. I think the UI needs to acknowledge this in some way and say something like 'keep panning to the right' or something. Otherwise, I've seen users stop here and try to label the curb ramps (before they get to the correct pan position for the tutorial to continue). image

Solution: Update the text to say something like “continue till you see two curb ramps”.

Updated 22/03/2017 18:46 3 Comments

Bugs on Insights sticky RTP and insights subscribe

canonical-websites/www.ubuntu.com

Summary

There’s 2 issues on Insights, one is that the RTP sticky isn’t displaying on webinar type posts (see 1st screenshot) and the second is that when trying to sign up to Insights updates there seems to be an issue as well (see 2nd screenshot) [please describe the issue]

Process


Current and expected result

1st problem: instead of showing the sticky RTP widget on the sidebar, I see the ‘ubuntu cloud’ with formatting issues.

2nd problem: sign up for email updates also looks like it’s having formatting issues and the confirmation message looks strange.


Screenshot

image

image


Updated 23/03/2017 23:06 1 Comment

Tutorial: Disable specific labeling order for the two curb ramp corner

ProjectSidewalk/SidewalkWebpage

Allow the user to label either of the curb ramps first and not restrict them to follow a specific order as imposed currently by the instructions.

From Jon’s comment in #531:

  1. This has been commonly said among the team but we should probably keep both of these arrows up and let the user label the curb ramps in the order they want rather than enforcing this weird right-to-left order that we do now (which confuses users). image See, I’ve already labeled the curb ramp but now the tutorial interface is telling me to label it again. This should not happen. It’s confusing! image

Manaswi: Yes, this would have happened if you labeled the left curb ramp first while the script wanted you to finish the right one first. We would have to update the script to work for either curb ramp.

Updated 22/03/2017 17:42

Tutorial: Providing corrective guidance when the user doesn't follow the given instruction

ProjectSidewalk/SidewalkWebpage

We should provide corrective guidance when the user makes a mistake while following the onboarding instructions (e.g. uses a missing curb ramp label instead of a curb ramp label, or starts panning in the opposite direction). The current implementation does have some form of guidance, e.g. when the user doesn't apply the right severity rating, but more enhancements are needed.

The other (or additional) solution is disabling interface elements based on the step the user is currently in. This would restrict the user's actions and allow them to concentrate on the correct action (e.g. applying the correct label).

From Jon’s comment:

  1. The tutorial should recognize when the user has done the wrong thing and offer corrective advice. In the following case, for example, the user selected the wrong label type and placed it in the wrong area. (Rather than or in addition to corrective guidance, we could restrict the tutorial interface a bit to only allow users to click on the labels that we want them to–i.e., the flashing things in the interface) image

Mentioned in #531

Updated 22/03/2017 17:02

Adds Ticker priority argument

pixijs/pixi.js

Overview

Inspired by @drkibitz’s comment https://github.com/pixijs/pixi.js/issues/3835#issuecomment-287284349. This adds an optional priority argument to Ticker.prototype.add(fn, context, priority) and Ticker.prototype.addOnce(fn, context, priority). This also adds unit-tests for the Ticker class.

This should address #3835.

Updated 25/03/2017 05:07 11 Comments

Use mapreduce to cut down on the SQL file generation time for an OSM API database ingest

ngageoint/hootenanny

It currently takes days to generate the SQL file when converting a planet file to an executable SQL file. The planet file has 21.5 billion records in it (yes, billion with a "b") as of 3/24/17.

We should be able to use some of the hadoop code already in hoot to help speed this process up. It goes without saying that all of this assumes an offline database that has been pre-prepped for bulk load and has no database constraints activated. IMPORTANT: Also, this workflow will have to give up OSM data validation (missing way nodes, etc.) like the current writer does. So this would only ever be used against data that previously existed within a database and had already been validated (planet data).

Possible workflow:

  • subclass OsmMapMapper to write a mapper class that uses the pbf reader to read in data and writes the corresponding sql statement for each element out to file; each record will arbitrarily get a changeset ID = 1; record ordering won't matter since db constraints are turned off (a rough sketch of this per-table keying follows this list)
    • What’s tricky here is where to write the data. I think we want each table type to go to its own file. That saves us from having to do a complete additional pass over the data to combine all the files. Not to mention that if we use mapreduce to also exec the sql, having a combined file buys us nothing anyway. In Java you can use the MultipleOutputs class to write to multiple outputs, but I don’t think we have that available in hoot’s pretty-pipes. So, either implement that (could be very hard) or we need to come up with some other way.
  • Then the reducer would end up combining all the sql statements written by the mappers for each table type into the same file. It wouldn’t need to bother sorting the statements by record id, however, since the db constraints are off. It also needs to keep a total count of elements processed by type.
  • after the first pass through the data, we'll have to do one more pass to update the changeset IDs. So if possible, write a mapper that takes in a changeset ID and updates all SQL records with that ID. The jobs would have to be specifically sized so that each mapper reads a number of features equal to the maximum changeset size (is that possible?). Problems:
    • how do you know what changeset ID to give each mapper; is there a way to globally assign an ID through the configuration? (man, it's been a long time since I've done hadoop code)
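
To make the per-table keying concrete, here is a minimal, illustrative sketch in Python of the map/reduce split described in the first two bullets. It is not hoot's pretty-pipes API; the element parsing, SQL templates, table names, and function names are stand-ins.

```python
# Illustrative only: key each generated SQL statement by its target table so a
# reducer can concatenate one file per table; changeset_id is hard-coded to 1
# on this first pass, as described above.
def map_element(element):
    """Emit (table_name, sql_statement) pairs for one OSM element."""
    if element["type"] == "node":
        yield ("current_nodes",
               "INSERT INTO current_nodes (id, latitude, longitude, changeset_id) "
               "VALUES ({id}, {lat}, {lon}, 1);".format(**element))
    # ... ways, relations, tags, etc. would emit rows for their own tables here.


def reduce_table(table_name, sql_statements):
    """Concatenate all statements for one table and report the element count."""
    count = 0
    with open(table_name + ".sql", "a") as out:
        for stmt in sql_statements:
            out.write(stmt + "\n")
            count += 1
    return table_name, count
```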

So if the above problems are overcome, we should be able to mapreduce the sql file writing part in two jobs. A difference from the non-mapreduce version would be that exec'ing the output would not be transaction safe, because the statements would be spread out across multiple files. I don't think we care about this in the offline workflow, though, even though the non-mapreduce writer still executes within a transaction (maybe it shouldn't?).

Also, none of the above files have the table headers written to them. So, if we planned to exec the sql in a non-mapreduce fashion we should just be able to prepend the table header line to each file. If we were going to exec the sql in a mapreduce fashion, then each mapper (or reducer?) would be responsible for prepending the header to each set of exec'd sql statements just before they were exec'd against the database.

If this task is successful, then a follow on task should possibly be created to port the SQL execution code to mapreduce as well. Additionally, if pg_bulkload ends up getting approved then we’d want to convert that complete workflow to mapreduce as well, as that workflow would probably end up being the fastest of all.

Updated 24/03/2017 14:16 1 Comment

Automatically Refreshed Analysis does not place data in the InspectCache

Rsl1122/Plan-PlayerAnalytics

Plan Version: 2.8.4
Server Version: -
Database Type: -
Cause: Automatic Analysis Refresh

Description: Inspect cache is empty even if automatic analysis is on, and all requests to /player/<player> return 404 even right after the automatic refresh.

Steps to Reproduce:

  1. Enable automatic analysis refresh in the config
  2. Wait for the inspect cache to clear after boot analysis
  3. Wait for the first automatic refresh to run
  4. Request an inspect page -> Returns 404

Proposed Solutions:

  • Stop clearing inspect cache
  • Fix inspect cache clearing
Updated 22/03/2017 10:40

Unterminated comment when running migrations - php artisan october:up

octobercms/october
Expected behavior

I used the builder plugin to make my migrations and added phpDoc block comments manually. I would expect it to run even with the comments.

Actual behavior

With the comments, it gives errors when running php artisan october:up in the CLI. Without them, it works like it should. See the warning below. Related to this closed issue: https://github.com/octobercms/october/issues/2421

PHP Warning: Unterminated comment starting line 25 in /var/www/gi/vendor/october/rain/src/Database/Updater.php on line 110 PHP Stack trace: PHP 1. {main}() /var/www/gi/artisan:0 PHP 2. Illuminate\Foundation\Console\Kernel->handle() /var/www/gi/artisan:36 PHP 3. Symfony\Component\Console\Application->run() /var/www/gi/vendor/laravel/framework/src/Illuminate/Foundation/Console/Kernel.php:107 PHP 4. Symfony\Component\Console\Application->doRun() /var/www/gi/vendor/symfony/console/Application.php:119 PHP 5. Symfony\Component\Console\Application->doRunCommand() /var/www/gi/vendor/symfony/console/Application.php:188 PHP 6. Illuminate\Console\Command->run() /var/www/gi/vendor/symfony/console/Application.php:843 PHP 7. Symfony\Component\Console\Command\Command->run() /var/www/gi/vendor/laravel/framework/src/Illuminate/Console/Command.php:136 PHP 8. Illuminate\Console\Command->execute() /var/www/gi/vendor/symfony/console/Command/Command.php:261 PHP 9. Illuminate\Container\Container->call() /var/www/gi/vendor/laravel/framework/src/Illuminate/Console/Command.php:150 PHP 10. call_user_func_array:{/var/www/gi/vendor/laravel/framework/src/Illuminate/Container/Container.php:507}() /var/www/gi/vendor/laravel/framework/src/Illuminate/Container/Container.php:507 PHP 11. System\Console\PluginRefresh->fire() /var/www/gi/vendor/laravel/framework/src/Illuminate/Container/Container.php:507 PHP 12. System\Classes\UpdateManager->updatePlugin() /var/www/gi/modules/system/Console/PluginRefresh.php:54 PHP 13. System\Classes\VersionManager->updatePlugin() /var/www/gi/modules/system/Classes/UpdateManager.php:470 PHP 14. System\Classes\VersionManager->applyPluginUpdate() /var/www/gi/modules/system/Classes/VersionManager.php:94 PHP 15. System\Classes\VersionManager->applyDatabaseScript() /var/www/gi/modules/system/Classes/VersionManager.php:141 PHP 16. October\Rain\Database\Updater->setUp() /var/www/gi/modules/system/Classes/VersionManager.php:392 PHP 17. October\Rain\Database\Updater->resolve() /var/www/gi/vendor/october/rain/src/Database/Updater.php:22 PHP 18. October\Rain\Database\Updater->getClassFromFile() /var/www/gi/vendor/october/rain/src/Database/Updater.php:74 PHP 19. token_get_all() /var/www/gi/vendor/october/rain/src/Database/Updater.php:110

And the migration code

```php
<?php

namespace NotPaper\GidesignCms\Updates;

use Schema;
use October\Rain\Database\Updates\Migration;

/**
 * Class BuilderTableUpdateNotpaperGidesigncmsBanners3
 * @package NotPaper\GidesignCms\Updates
 */
class BuilderTableUpdateNotpaperGidesigncmsBanners3 extends Migration
{
    /**
     * Drop column banner.
     */
    public function up()
    {
        Schema::table('notpaper_gidesigncms_banners', function($table)
        {
            $table->dropColumn('banner');
        });
    }

    /**
     * Add column banner.
     */
    public function down()
    {
        Schema::table('notpaper_gidesigncms_banners', function($table)
        {
            $table->string('banner', 255);
        });
    }
}
```
Reproduce steps

Use the builder to add migrations and add a phpDoc block. Then it gives this warning.

October build

Sorry, can’t find the build number.

Updated 27/03/2017 08:01 6 Comments

Bugged quest in Duskwood

talamortis/azerothcore-classic

Description of the Issue: In the Morbent Fel quest, you cannot dispel his shield; if you equip the item it says "there is nothing to dispel", making the quest unbeatable.

QuestID/ BossID: 55

Instance:

What should happen: It should get rid of his shield so you can smash his face

Faction: Alliance

How to reproduce: Try the quest

Updated 21/03/2017 21:21

Count number of times integrator calculated derivatives.

RobotLocomotion/drake

To be able to test integrator performance, it is helpful to know how many times (integer count) the time-derivatives of the states are calculated.

This is also helpful for testing, where these numbers should be relatively constant for each benchmark test as the entire code base evolves. If the number changes significantly, it should be a well-understood phenomenon that caused it (e.g., a model of friction changed).

Note: This feature request is specific to numerical integration. It is unrelated to the system architecture (e.g., unrelated to system code such as input or output ports).
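
As a rough illustration of the requested bookkeeping (this is not Drake's integrator API; the wrapper and names below are hypothetical), one could count derivative evaluations by wrapping the derivative callback:

```python
# Hypothetical sketch: count how many times an integrator evaluates xdot = f(t, x).
class CountingDerivatives:
    def __init__(self, f):
        self.f = f
        self.num_evaluations = 0  # the integer count requested in this issue

    def __call__(self, t, x):
        self.num_evaluations += 1
        return self.f(t, x)


# A benchmark would assert that num_evaluations stays roughly constant across
# code revisions unless a well-understood change explains the difference.
derivs = CountingDerivatives(lambda t, x: -x)    # xdot = -x
x, t, h = 1.0, 0.0, 0.01
for _ in range(100):                             # explicit Euler steps
    x += h * derivs(t, x)
    t += h
assert derivs.num_evaluations == 100
```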

Updated 21/03/2017 18:56 1 Comment

Sourcecomms mutes admins every round.

sbpp/sourcebans-pp

Current Behavior

On my servers that run TF2Jailbreak v5.5.8 (any version, actually), SourceComms mutes every admin at the start of every round, and on reconnect too.

However, there are no logs of these mutes happening. Our admins have to manually mute themselves and then unmute themselves for the mute to be removed.

With SourceComms ((SB++) 1.5.4.6) by Alex, Sarabveer(VEER™) enabled on SourceBans++ (1.5.4.6) by SourceBans Development Team, Sarabveer(VEER™), this happens.

Whenever I disable sourcecomms and change the map, admins are no longer automatically muted.

Updated 21/03/2017 12:50

API: Resources uploaded via the API do not display an image preview

Cadasta/cadasta-platform

Steps to reproduce the error

  • Find an AWS link to a resource already uploaded
  • Send a POST request to /api/v1/organization/<blah>/projects/<blarg>/spatial/<blarf-id>/resources/ with file=<aws_link> name='blarg' original_file='image.jpg' AUTHORIZATION:'Token <api key>' (a sketch of this request follows)
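
For concreteness, a minimal sketch of such a request using Python's requests library; the host and the placeholder slugs/token mirror the ones above and are not real values:

```python
# Illustrative only; substitute a real host, organization/project/location slugs, and token.
import requests

url = ("https://<host>/api/v1/organization/<blah>/projects/<blarg>"
       "/spatial/<blarf-id>/resources/")
payload = {
    "file": "<aws_link>",            # link to the already-uploaded S3 object
    "name": "blarg",
    "original_file": "image.jpg",
}
headers = {"Authorization": "Token <api key>"}

response = requests.post(url, json=payload, headers=headers)
print(response.status_code, response.json())
```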

Actual behavior

  • Resource gets added to the location, but there’s no preview image
  • When you download it, it is the correct resource

<img width="610" alt="screen shot 2017-03-20 at 4 36 31 pm" src="https://cloud.githubusercontent.com/assets/8128188/24120557/67e9bf4a-0d8b-11e7-9892-bab846fb987e.png">

Expected behavior

  • Preview image is created.
Updated 27/03/2017 12:23 8 Comments

Architectural Ideas

erayd/json-schema-info

Interesting idea, thanks for putting this together. I have two discussion points to kick things off with:

  1. Abstract rules out such that they are language-agnostic. This might mean that they have to go in a separate repo, not sure, but the idea is that the rules are encoded once in a single place, and then any language can consume them, not just PHP. The format could be anything, really, but obvious suggestions are JSON, YAML, XML, and, well, I’ll stop there. Perhaps it could live in the official json-schema-spec repo?

  2. Separate the draft versions. Getting back to PHP and putting #1 aside for the moment, the suggestion is to rearrange things such that lower schemas have no knowledge of the higher schemas. A few obvious ideas are to use a class hierarchy, or a system of config files that would be layered together. We'll have to sort out the details along the way, but I think that with either system all rules could be queried with any level of draft, and all rules would be implicitly off (see the sketch below).
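
To illustrate the layering idea only (the rule names and structure below are invented for this example, not a proposed format):

```python
# Illustrative sketch of per-draft rule layers: each draft builds on the layer
# below it, lower layers know nothing of higher ones, and a rule a layer does
# not mention is implicitly off.
DRAFT_04 = {
    "keyword.exclusiveMinimum.boolean": True,   # boolean form in draft-04
}
DRAFT_06 = {
    **DRAFT_04,
    "keyword.exclusiveMinimum.boolean": False,  # became a number in draft-06
    "keyword.const": True,                      # new keyword in draft-06
}


def rule_enabled(layer, rule_name):
    """A rule that a draft layer does not mention is implicitly off."""
    return layer.get(rule_name, False)


assert rule_enabled(DRAFT_04, "keyword.const") is False
assert rule_enabled(DRAFT_06, "keyword.const") is True
```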

Updated 26/03/2017 21:51 18 Comments

cmake option WITH_ALL_SUPPORTED_EXTERNALS gives vtk target error

RobotLocomotion/drake

In a clean build directory, setting WITH_ALL_SUPPORTED_EXTERNALS=ON gives the following configuration error:

 CMake Error at cmake/modules/3.7/ExternalProject.cmake:2437 (add_custom_target):
   add_custom_target cannot create target "vtk" because an imported target
   with the same name already exists.
 Call Stack (most recent call first):
   cmake/externals.cmake:217 (ExternalProject_Add)
   cmake/externals.cmake:520 (drake_add_cmake_external)
   CMakeLists.txt:160 (drake_add_external)



 CMake Error at cmake/modules/3.7/ExternalProject.cmake:1175 (message):
   External project "vtk" has no stamp_dir
 Call Stack (most recent call first):
   cmake/modules/3.7/ExternalProject.cmake:1410 (ExternalProject_Get_Property)
   cmake/modules/3.7/ExternalProject.cmake:1452 (_ep_get_step_stampfile)
   cmake/externals.cmake:107 (ExternalProject_Add_Step)
   cmake/externals.cmake:529 (drake_forceupdate)
   CMakeLists.txt:160 (drake_add_external)

This is with cmake 3.5.2 on Ubuntu 14.04.

Updated 20/03/2017 21:36 1 Comment

[Design] user flow and design document

airplake/airplake-foundation-core

(Public page) Homepage

  • Community introduction
  • Projects currently in progress
  • Ranking of active community members
  • header: Home | Community pulse | Log in | Sign up

(Internal page) My tasks

  • Add, delete, and edit tasks
  • Submit a review request for a task
  • History list of completed tasks

(Internal page) My profile

  • Edit my information (phone number, WeChat ID, email address, bank card number, payment address, payee)
  • Stats: total credits completed, credits currently in progress
  • Weekly activity (credits by week)
Updated 20/03/2017 12:01 4 Comments

Redundancy in uploading images

fossasia/phimpme-android

Actual Behaviour

The images are added again if selected from the image selector in the upload section.

Expected Behaviour

The uploaded image should not be re-added; a unique hash should exist for each image (see the sketch after the steps below).

Steps to reproduce it

  1. Go to upload tab and upload image from the image selector.
  2. After uploading, click on upload image button again.
  3. The image selector opens, and the same images can be added again.
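
As a rough illustration of the hash-based de-duplication suggested above (a language-agnostic sketch written in Python, not the app's actual code):

```python
# Illustrative only: skip an image whose content hash is already queued for upload.
import hashlib


def file_hash(path, chunk_size=8192):
    """Return the SHA-256 hex digest of the file contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


queued_hashes = set()


def add_to_upload_queue(path, queue):
    h = file_hash(path)
    if h in queued_hashes:
        return False              # duplicate of an image already selected
    queued_hashes.add(h)
    queue.append(path)
    return True
```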

Screenshots of the issue

Images 2 and 4 are the same.

screenshot_2017-03-20-02-32-38-040_vn mbm phimp me

Would you like to work on the issue?

Yes, I would like to work on it.

Updated 20/03/2017 09:17 1 Comment

App Crash in uploading image

fossasia/phimpme-android

Actual Behaviour

The app is crashing in the uploading section in a particular workflow.

Expected Behaviour

The app should not crash.

Steps to reproduce it

  1. Open upload section.
  2. Add images to upload.
  3. After the images are added in the upload fragment, select one of the images to add filters to it.
  4. Do not change anything, and click discard.
  5. Click the same image again.

LogCat for the issue

screenshot from 2017-03-20 02-33-40

Would you like to work on the issue?

Yes, I would like to work on it.

Updated 20/03/2017 06:11 1 Comment

Back button is not working properly

fossasia/phimpme-android

Actual Behaviour

If we click the back button in the gallery activity, it shows a dialog box, but in other activities pressing the back button closes the app.

Expected Behaviour

This inconsistency should not happen.

Steps to reproduce it


LogCat for the issue


Screenshots of the issue


Would you like to work on the issue?

yes

Updated 19/03/2017 17:13 1 Comment
