Contribute to Open Source. Search issue labels to find the right project for you!

Webapp not loading. Errors in console

orizens/echoes-ng2

    polyfills.8ac47b1b34705312c109.bundle.js:5 EXCEPTION: Invalid event target
    t.handleError @ polyfills.8ac47b1b34705312c109.bundle.js:5
    polyfills.8ac47b1b34705312c109.bundle.js:5 ORIGINAL STACKTRACE:
    t.handleError @ polyfills.8ac47b1b34705312c109.bundle.js:5
    polyfills.8ac47b1b34705312c109.bundle.js:5 TypeError: Invalid event target
        at Function.e.setupSubscription (main.dda060a7a5f5e821a156.bundle.js:7)
        at e._subscribe (main.dda060a7a5f5e821a156.bundle.js:7)
        at e.t.subscribe (polyfills.8ac47b1b34705312c109.bundle.js:1)
        at t.call (main.dda060a7a5f5e821a156.bundle.js:9)
        at t.subscribe (polyfills.8ac47b1b34705312c109.bundle.js:1)
        at t.call (thirdparty.c457ec9d14808cfb9795.bundle.js:5)
        at t.subscribe (polyfills.8ac47b1b34705312c109.bundle.js:1)
        at t.call (thirdparty.c457ec9d14808cfb9795.bundle.js:2)
        at t.subscribe (polyfills.8ac47b1b34705312c109.bundle.js:1)
        at t.attachEvent (main.dda060a7a5f5e821a156.bundle.js:2)
    t.handleError @ polyfills.8ac47b1b34705312c109.bundle.js:5
    polyfills.8ac47b1b34705312c109.bundle.js:8 TypeError: Invalid event target(…)
    o @ polyfills.8ac47b1b34705312c109.bundle.js:8
    chrome-extension://enhhojjnijigcajfphajepfemndkmdlo/cast_sender.js Failed to load resource: net::ERR_FAILED
    chrome-extension://fmfcbgogabcbclcofgocippekhfcmgfj/cast_sender.js Failed to load resource: net::ERR_FAILED
    chrome-extension://fjhoaacokmgbjemoflkofnenfaiekifl/cast_sender.js Failed to load resource: net::ERR_FAILED

Updated 27/03/2017 05:01 3 Comments

Verify and add new BOM

Hooklet3d/OdinOne

I have added a ‘working directory’ with the current data for the BOM and the kit parts count. If anyone has a chance please check this out and verify the numbers.

Any thoughts on the best way to log these numbers would be great; I like anything that automates the process. Currently they are in Excel sheets.
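
One possible direction, as a sketch: export the counts from Excel to a plain "part,count" CSV and append a dated snapshot to a running log whenever they change. The file names (bom.csv, bom-log.csv) are placeholders, not anything already in the repo.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.time.LocalDate;
import java.util.List;
import java.util.stream.Collectors;

// Sketch only: appends today's BOM counts (exported as a "part,count" CSV) to a
// running log file so the numbers are tracked over time instead of living in
// one-off spreadsheets.
public class BomSnapshot {
    public static void main(String[] args) throws IOException {
        Path csv = Paths.get(args.length > 0 ? args[0] : "bom.csv"); // hypothetical export
        String stamp = LocalDate.now().toString();
        List<String> dated = Files.readAllLines(csv, StandardCharsets.UTF_8).stream()
                .filter(line -> !line.isEmpty())
                .map(line -> stamp + "," + line) // prefix every row with the date
                .collect(Collectors.toList());
        Files.write(Paths.get("bom-log.csv"), dated, StandardCharsets.UTF_8,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        System.out.println("Logged " + dated.size() + " rows for " + stamp);
    }
}
```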

Stand by for verification on the working dir… will share a commit.

Updated 21/03/2017 02:34

No module named 'awscli'

uboslinux/ubos-packages

Attempting to back up to Amazon S3 with encryption on a new machine:

ERROR: Traceback (most recent call last):
  File "/usr/bin/aws", line 19, in <module>
    import awscli.clidriver
ModuleNotFoundError: No module named 'awscli'
Updated 19/03/2017 05:03 1 Comment

Error during backup

uboslinux/ubos-admin
Undefined subroutine &UBOS::Backup::ZipFileBackupContext::warning called at /usr/lib/perl5/vendor_perl/UBOS/Backup/ZipFileBackupContext.pm line 160.
[root@ubos-pc backups]# vi /usr/lib/perl5/vendor_perl/UBOS/Backup/ZipFileBackupContext.pm 

Looks like an import of UBOS::Logging or similar is missing.

Updated 21/03/2017 03:30

Define es:top-level or equivalent

marklogic/entity-services

When putting entity instances into documents, one always chooses one entity to be the root and embeds or denormalizes the others into the document structure.

Entity Services in MarkLogic 9 does not provide this vocabulary, but repeated use of the code generation indicates that it could be automated significantly with this extra concept.

  1. Entry point of a module is known.
  2. Expectations of whether a reference property contains an embedded instance or a reference to an external one.
  3. Refinement of extraction template to require less customization.
Updated 20/03/2017 18:29

schema-gen() returns different results for XML and JSON when an entity type has a duplicate property name

marklogic/entity-services

This issue was found when testing #212 .

Sample doc: Attached a zip file

Query:

    import module namespace es = 'http://marklogic.com/entity-services' at '/MarkLogic/entity-services/entity-services.xqy';

    es:schema-generate( es:model-validate( doc('invalid-bug43212.json') ) ),
    es:schema-generate( es:model-from-xml( fn:doc('invalid-bug43212.xml') ) )

Test Input file.zip

Issue: With the JSON input file, schema-gen() returns an additional comment, as below:

    <!-- XSD schemas prohibit duplicate element names. This element is commented out because it conflicts with another of the same name.
    <xs:element name="OrderID" type="xs:string" xmlns:xs="http://www.w3.org/2001/XMLSchema"/> -->

Updated 15/03/2017 14:44 3 Comments

Eclipse reports QueryBatcherIteratorTest having a resource leak.

marklogic/java-client-api

Eclipse flags the test method

 public void test_A_OnDiskIterator() throws Exception

at the reader object, because it is not closed. We create a file in the /tmp folder with the name QueryBatcherIteratorTest_

    // now we have the uris, let's step through them and do nothing with them
    AtomicInteger successDocs2 = new AtomicInteger(0);
    BufferedReader reader = new BufferedReader(new FileReader(tempFile));
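
A minimal sketch of the change Eclipse is asking for, under the assumption that the file name below stands in for the actual temp file (whose name is truncated above): wrapping the reader in try-with-resources closes it, and the underlying FileReader, even if the loop throws.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch only: the file name and loop body stand in for the actual test code.
public class CloseReaderSketch {
    public static void main(String[] args) throws IOException {
        AtomicInteger successDocs2 = new AtomicInteger(0);
        // try-with-resources guarantees reader.close() runs, which is what the
        // Eclipse resource-leak warning is asking for.
        try (BufferedReader reader = new BufferedReader(new FileReader("/tmp/QueryBatcherIteratorTest_example"))) {
            String uri;
            while ((uri = reader.readLine()) != null) {
                successDocs2.incrementAndGet(); // step through the uris and do nothing with them
            }
        }
        System.out.println("uris read: " + successDocs2.get());
    }
}
```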

Updated 20/03/2017 22:40 1 Comment

@OneToMany association not loading with query in Kundera

impetus-opensource/Kundera

My code is below

@Entity
@Table(name = "promo_code")
public class PromoCode {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
@Column(name = "promo_id")
private Integer promo_id;

@OneToMany(cascade = { CascadeType.ALL }, fetch = FetchType.EAGER)
@JoinColumn(name = "promo_id")
private List<PromoChannelMapping> channels;

@Column(name = "budget")
private Double budget;

@Column(name = "start_date")
private Date start_date;

//setter and getters
}

@Entity
@Table(name = "promo_channel_mapping")
public class PromoChannelMapping {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
@Column(name = "channel_mapping_id")
private Integer channel_mapping_id;

@Column(name = "promo_id")
private Integer promo_id;

@OneToOne(cascade = { CascadeType.ALL }, fetch = FetchType.EAGER)
@JoinColumn(name = "channel_id")
private Channel channel_id;

@Column(name = "created_date")
private Timestamp created_date;

@Column(name = "updated_date")
private Timestamp updated_date;

@Column(name = "created_by")
private Integer created_by;

@Column(name = "updated_by")
private Integer updated_by;

//setter and getters
}

EntityManagerFactory emf = Persistence.createEntityManagerFactory("laalsa_sql");
EntityManager em = emf.createEntityManager();
Query q = em.createQuery("Select p from PromoCode p");
List<?> results = q.getResultList();

The result is an empty list every time.

I am using the dependencies below:

<dependency>
        <groupId>com.impetus.kundera.client</groupId>
        <artifactId>kundera-mongo</artifactId>
        <version>3.6</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.impetus.kundera.client/kundera-rdbms -->
    <dependency>
        <groupId>com.impetus.kundera.client</groupId>
        <artifactId>kundera-rdbms</artifactId>
        <version>3.6</version>
    </dependency>

Please help me

Updated 01/03/2017 10:45 16 Comments

Menu Order

bueltge/Adminimize

I have Adminimize installed alongside the WP-CRM System plugin. WP-CRM hooks the filter custom_menu_order to change the order of its sub-menu (by altering the global $submenu variable). Adminimize caches and displays the revised indexed values of the sub-menu on its settings page.

However, when Adminimize hides selected menus for a role, a user with that role who logs into the dashboard gets the wrong sub-menus hidden, because Adminimize hooks onto the action admin_menu, which fires before the filter custom_menu_order.

To fix this, I found that simply changing the Adminimize function _mw_adminimize_set_menu_option hook from the action admin_menu to the filter custom_menu_order works beautifully.

This is on line 437 of the file adminimize.php, which I changed to:

add_filter( 'custom_menu_order', '_mw_adminimize_set_menu_option', 99999 );

via https://wordpress.org/support/topic/bug-adminize-wrong-submenu-page-in-re-ordered-sub-menus/

Updated 28/02/2017 15:25

Handling NPE withBlackList and withWhiteList methods

marklogic/java-client-api

The withBlackList and withWhiteList methods throw a NullPointerException when passed a null list; this should be handled.

Test:

@Test
    public void testWhiteList() throws Exception{
        Assume.assumeTrue(hostNames.length > 1);

        final String query1 = "fn:count(fn:doc())";

        try{
            DocumentMetadataHandle meta6 = new DocumentMetadataHandle().withCollections("NoHost").withQuality(0);

            Assert.assertTrue(dbClient.newServerEval().xquery(query1).eval().next().getNumber().intValue() == 0);

            WriteBatcher ihb2 =  dmManager.newWriteBatcher();

            FilteredForestConfiguration forestConfig = new FilteredForestConfiguration(dmManager.readForestConfig())
                            .withBlackList(null);
}

Exception:

java.lang.NullPointerException
    at com.marklogic.client.datamovement.FilteredForestConfiguration.withBlackList(FilteredForestConfiguration.java:172)
    at com.marklogic.client.datamovement.functionaltests.WriteHostBatcherTest.testWhiteList(WriteHostBatcherTest.java:2651)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
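
For reference, a sketch of the kind of argument check the issue is asking for. The class below is illustrative only, not the actual FilteredForestConfiguration source: it simply fails fast with a descriptive IllegalArgumentException instead of letting a bare NullPointerException escape.

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative stand-in for FilteredForestConfiguration.withBlackList/withWhiteList.
class FilteredForestConfigurationSketch {
    private final Set<String> blackList = new HashSet<>();

    FilteredForestConfigurationSketch withBlackList(String... hosts) {
        if (hosts == null) {
            throw new IllegalArgumentException("hosts must not be null");
        }
        for (String host : hosts) {
            if (host == null) {
                throw new IllegalArgumentException("hosts must not contain null entries");
            }
            blackList.add(host);
        }
        return this;
    }

    public static void main(String[] args) {
        try {
            new FilteredForestConfigurationSketch().withBlackList((String[]) null);
        } catch (IllegalArgumentException e) {
            System.out.println("Rejected cleanly: " + e.getMessage());
        }
    }
}
```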
Updated 23/02/2017 20:01

proxy/packet.py send_slot doesn't support string type item ids

benbaptist/minecraft-wrapper

Discovered this whilst trying to update ‘example-plugins/open.py’

Caused by ‘proxy/packet.py’, line 410. Newer versions of Minecraft use strings rather than shorts for item IDs, so a special case needs to be added to handle newer and older versions of Minecraft and send the respective data type.

I would add a pull request, but I haven't been through enough of the existing code yet to see how you handle version-specific cases.

Updated 21/02/2017 00:08 5 Comments

Flr 294

LigaData/Kamanja

Used the class MetadataAPISerialization to generate the JSON for the corresponding GetAll API functions. For adapters, the JSON result will look something like the example below:

    "Adapters": [
      {
        "Adapter": {
          "Name": "MedicalInput",
          "TypeString": "Input",
          "ClassName": "com.ligadata.kafkaInputOutputAdapters_v10.KamanjaKafkaConsumer$",
          "JarName": "kamanjakafkaadapters_0_10_2.11-1.6.1.jar",
          "DependencyJars": [
            "kafka-clients-0.10.0.1.jar",
            "KamanjaInternalDeps_2.11-1.6.1.jar",
            "ExtDependencyLibs_2.11-1.6.1.jar",
            "ExtDependencyLibs2_2.11-1.6.1.jar"
          ],
          "AdapterSpecificCfg": "{\"HostList\":\"localhost:9092\",\"TopicName\":\"medicalinput\"}",
          "TenantId": "tenant1",
          "FullAdapterConfig": "{\"TenantId\":\"tenant1\",\"DependencyJars\":[\"kafka-clients-0.10.0.1.jar\",\"KamanjaInternalDeps_2.11-1.6.1.jar\",\"ExtDependencyLibs_2.11-1.6.1.jar\",\"ExtDependencyLibs2_2.11-1.6.1.jar\"],\"ClassName\":\"com.ligadata.kafkaInputOutputAdapters_v10.KamanjaKafkaConsumer$\",\"Name\":\"MedicalInput\",\"AdapterSpecificCfg\":{\"HostList\":\"localhost:9092\",\"TopicName\":\"medicalinput\"},\"TypeString\":\"Input\",\"JarName\":\"kamanjakafkaadapters_0_10_2.11-1.6.1.jar\"}"
        }
      },
      {
        "Adapter": {
          "Name": "Storage_1",
          "TypeString": "Storage",
          "ClassName": "",
          "JarName": "",
          "DependencyJars": [],
          "AdapterSpecificCfg": "",
          "TenantId": "tenant1",
          "FullAdapterConfig": "{\"TenantId\":\"tenant1\",\"Location\":\"/media/home3/installKamanja161/Kamanja-1.6.1_2.11/storage/tenant1_storage_1\",\"Name\":\"Storage_1\",\"portnumber\":\"9100\",\"StoreType\":\"h2db\",\"TypeString\":\"Storage\",\"connectionMode\":\"embedded\",\"SchemaName\":\"testdata\",\"user\":\"test\",\"password\":\"test\"}"
        }
      },

Updated 07/02/2017 09:30

Ars Magica 2

combak/ShatteredWorld

There is a problem with the Fire Guardian: we cannot summon him.

Tested it in singleplayer, where it works without any problems; only on the server it refuses to work.

Regards, Thor

Updated 06/03/2017 08:25 5 Comments

Teleporting NPCs away

combak/ShatteredWorld

relativ has tested it: with the Dislocation Amulet you can teleport sneaking players, NPCs, animals, and all monsters.

NPCs never find their way back to their dimension. Please deactivate the amulet, or find some other solution.

Updated 14/03/2017 17:08 8 Comments

Bad results with -split_input

marklogic/marklogic-contentpump

I checked out mlcp and built via the instructions at https://github.com/marklogic/marklogic-contentpump

Then unzipped the mlcp bin deliverable, and

    $ marklogic-contentpump/mlcp/deliverable/mlcp-9.0-EA3/bin/mlcp.sh version

gave

    chamlin@MacPro-3445:bug$ /Users/chamlin/tickets/18184-lds-mlcp/marklogic-contentpump/mlcp/deliverable/mlcp-9.0-EA3/bin/mlcp.sh version
    ContentPump version: 9.0-EA3
    Java version: 1.8.0_51
    Hadoop version: 2.6.0
    Supported MarkLogic versions: 6.0 - 9.0

Next run

    $ marklogic-contentpump/mlcp/deliverable/mlcp-9.0-EA3/bin/mlcp.sh -options_file test-quotes.txt
    17/02/02 21:29:11 INFO contentpump.LocalJobRunner: Content type: XML
    17/02/02 21:29:12 INFO contentpump.ContentPump: Job name: local_755095747_1
    17/02/02 21:29:12 INFO contentpump.FileAndDirectoryInputFormat: Total input paths to process : 1
    17/02/02 21:29:13 INFO contentpump.LocalJobRunner: completed 100%
    17/02/02 21:29:13 INFO contentpump.LocalJobRunner: com.marklogic.mapreduce.MarkLogicCounter:
    17/02/02 21:29:13 INFO contentpump.LocalJobRunner: INPUT_RECORDS: 900
    17/02/02 21:29:13 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS: 900
    17/02/02 21:29:13 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS_COMMITTED: 900
    17/02/02 21:29:13 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS_FAILED: 0
    17/02/02 21:29:13 INFO contentpump.LocalJobRunner: Total execution time: 0 sec

This reports 900 records and I see 900 in the database. All is well.

But, if I uncomment the lines

    -split_input
    true
    -max_split_size
    1000

and rerun I get

    marklogic-contentpump/mlcp/deliverable/mlcp-9.0-EA3/bin/mlcp.sh -options_file test-quotes.txt
    17/02/02 21:45:30 INFO contentpump.LocalJobRunner: Content type: XML
    17/02/02 21:45:30 INFO contentpump.ContentPump: Job name: local_2038694039_1
    17/02/02 21:45:30 INFO contentpump.FileAndDirectoryInputFormat: Total input paths to process : 1
    17/02/02 21:45:30 INFO contentpump.DelimitedTextInputFormat: 72 DelimitedSplits generated
    17/02/02 21:45:31 ERROR contentpump.LocalJobRunner: Error running task:
    java.lang.RuntimeException: java.io.IOException: (line 2) invalid char between encapsulated token and delimiter
        at org.apache.commons.csv.CSVParser$1.getNextRecord(CSVParser.java:442)
        at org.apache.commons.csv.CSVParser$1.hasNext(CSVParser.java:452)
        at com.marklogic.contentpump.SplitDelimitedTextReader.initParser(SplitDelimitedTextReader.java:205)
        at com.marklogic.contentpump.SplitDelimitedTextReader.initialize(SplitDelimitedTextReader.java:62)
        at com.marklogic.contentpump.LocalJobRunner$TrackingRecordReader.initialize(LocalJobRunner.java:439)
        at com.marklogic.contentpump.LocalJobRunner$LocalMapTask.call(LocalJobRunner.java:373)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.io.IOException: (line 2) invalid char between encapsulated token and delimiter
        at org.apache.commons.csv.Lexer.parseEncapsulatedToken(Lexer.java:275)
        at org.apache.commons.csv.Lexer.nextToken(Lexer.java:152)
        at org.apache.commons.csv.CSVParser.nextRecord(CSVParser.java:498)
        at org.apache.commons.csv.CSVParser$1.getNextRecord(CSVParser.java:439)
        ... 9 more
    17/02/02 21:45:31 INFO contentpump.LocalJobRunner: completed 1%
    17/02/02 21:45:31 INFO contentpump.LocalJobRunner: com.marklogic.mapreduce.MarkLogicCounter:
    17/02/02 21:45:31 INFO contentpump.LocalJobRunner: INPUT_RECORDS: 910
    17/02/02 21:45:31 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS: 910
    17/02/02 21:45:31 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS_COMMITTED: 910
    17/02/02 21:45:31 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS_FAILED: 0
    17/02/02 21:45:31 INFO contentpump.LocalJobRunner: Total execution time: 1 sec

and ML shows 888 in the db after the run. So the count shows more, and the db shows less.

I did a little debugging in SplitDelimitedTextReader.java to show the start/end of the splits. It appears the file is OK, but when the split falls (after moving back one) on the second quote in a line, you get this parse error. It turns out that parserIterator.hasNext() actually reads a record, and the parse fails. The 888 is explained by the records thrown away when the split hits the error. I have no idea why the input record count is even higher when the split is tossed; certainly there weren't 910 records committed. No errors showed in the log file.
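
For what it's worth, the read-ahead is easy to reproduce with commons-csv on its own. A minimal sketch (the stray character after the closing quote on line 2 imitates what a split that lands on a quote hands the parser):

```java
import java.util.Iterator;
import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;

// hasNext() has to parse the next record to answer, so a malformed line fails
// inside hasNext(), wrapped in a RuntimeException, as in the stack trace above.
public class HasNextReadsAhead {
    public static void main(String[] args) throws Exception {
        // Stray character after the closing quote on line 2.
        String csv = "a,\"b\",c\nd,\"e\"x,f\n";
        try (CSVParser parser = CSVParser.parse(csv, CSVFormat.DEFAULT)) {
            Iterator<CSVRecord> it = parser.iterator();
            System.out.println(it.next()); // first record parses fine
            try {
                it.hasNext();              // parsing of line 2 happens here
            } catch (RuntimeException e) {
                System.out.println("hasNext() failed: " + e.getCause());
            }
        }
    }
}
```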

Running the other file as

    marklogic-contentpump/mlcp/deliverable/mlcp-9.0-EA3/bin/mlcp.sh -options_file test-jp.txt

gives

    17/02/02 21:53:03 INFO contentpump.LocalJobRunner: Content type: XML
    17/02/02 21:53:03 INFO contentpump.ContentPump: Job name: local_367057373_1
    17/02/02 21:53:03 INFO contentpump.FileAndDirectoryInputFormat: Total input paths to process : 1
    17/02/02 21:53:03 INFO contentpump.DelimitedTextInputFormat: 138 DelimitedSplits generated
    17/02/02 21:53:04 INFO contentpump.LocalJobRunner: completed 0%
    17/02/02 21:53:04 INFO contentpump.LocalJobRunner: com.marklogic.mapreduce.MarkLogicCounter:
    17/02/02 21:53:04 INFO contentpump.LocalJobRunner: INPUT_RECORDS: 899
    17/02/02 21:53:04 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS: 899
    17/02/02 21:53:04 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS_COMMITTED: 899
    17/02/02 21:53:04 INFO contentpump.LocalJobRunner: OUTPUT_RECORDS_FAILED: 0
    17/02/02 21:53:04 INFO contentpump.LocalJobRunner: Total execution time: 1 sec

From debugging, it looks like the problem comes up when the split is at the beginning of a record. Changing the split max size up or down changes the number of records ingested.

The reason I was testing with Japanese text is that the file seek in SplitDelimitedTextReader looks byte-oriented, and I wondered if it could land in the middle of a multi-byte UTF-8 encoding of a character. Or even whether there might be a problem with \r\n on Windows systems. bug.zip
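
A tiny illustration of the first concern, independent of mlcp: a byte offset is not a character boundary in UTF-8, so a byte-oriented seek can land in the middle of a multi-byte character.

```java
import java.nio.charset.StandardCharsets;

// Decoding from an arbitrary byte offset can start inside a multi-byte
// character; the orphaned continuation bytes decode to U+FFFD replacement
// characters instead of the original text.
public class ByteSeekDemo {
    public static void main(String[] args) {
        byte[] utf8 = "値,テスト\n".getBytes(StandardCharsets.UTF_8);
        // Pretend a split starts at byte 1, i.e. inside the 3-byte sequence for '値'.
        String tail = new String(utf8, 1, utf8.length - 1, StandardCharsets.UTF_8);
        System.out.println(tail); // begins with replacement characters
    }
}
```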

Updated 13/02/2017 18:26 2 Comments

Update ServerConfigurationManagerImpl to not use content-versions

marklogic/java-client-api

From the REST guide.

The update-policy property replaces the older content-versions policy; content-versions is deprecated. Setting update-policy to version-optional is equivalent to setting content-versions to optional. Setting update-policy to version-required is equivalent to setting content-versions to required.

The REST admin API currently returns both, but before we remove the deprecated content-versions from the REST admin API we should stop using it in the Java Client API.
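
A small illustrative mapping of the equivalence quoted above; the class, enum, and method names here are hypothetical, not part of the Java Client API.

```java
// Hypothetical names, for illustration only. It restates the REST guide:
// update-policy replaces content-versions, with version-optional equivalent
// to optional and version-required equivalent to required.
public class UpdatePolicyMapping {
    enum UpdatePolicy { VERSION_OPTIONAL, VERSION_REQUIRED }

    static UpdatePolicy fromContentVersions(String contentVersions) {
        switch (contentVersions) {
            case "optional": return UpdatePolicy.VERSION_OPTIONAL;
            case "required": return UpdatePolicy.VERSION_REQUIRED;
            default: throw new IllegalArgumentException("Unknown content-versions value: " + contentVersions);
        }
    }

    public static void main(String[] args) {
        System.out.println(fromContentVersions("optional"));  // VERSION_OPTIONAL
        System.out.println(fromContentVersions("required"));  // VERSION_REQUIRED
    }
}
```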

Updated 30/01/2017 23:00

Page Access

bueltge/Adminimize

I actually needed it so that editors could not have access to comments. If they click a direct link to the PHP file, it works great and blocks them. But if you add query parameters to the link, you can bypass it.

WORKS: http://######/wp-admin/edit-comments.php

Doesn't work: http://######/wp-admin/edit-comments.php?comment_status=all

Updated 30/01/2017 13:11

Loading document fails when used with a MetaData object with property set

marklogic/java-client-api

Loading a document fails when it is used with a metadata object that has a property set. If .withProperty("docMeta-1", "true").withQuality(1) is removed from the DocumentMetadataHandle object, the documents are written fine.

    @Test
    public void addWithMetadata() throws Exception{

        final String query1 = "fn:count(fn:doc())";

        try{
            DocumentMetadataHandle meta6 = new DocumentMetadataHandle()
                     .withCollections("Sample Collection 1").withProperty("docMeta-1", "true").withQuality(1);
            meta6.setFormat(Format.XML);
            Assert.assertTrue(dbClient.newServerEval().xquery(query1).eval().next().getNumber().intValue() == 0);

            Thread.currentThread().sleep(5000L);

            DatabaseClient dbClient = DatabaseClientFactory.newClient(host, port, user, password, Authentication.DIGEST);
            DataMovementManager dmManager = dbClient.newDataMovementManager();

            WriteBatcher ihb2 =  dmManager.newWriteBatcher();
            ihb2.withBatchSize(50).withThreadCount(1);

            ihb2.onBatchSuccess(
                    batch -> {
                        }
                    )
                    .onBatchFailure(
                      (batch, throwable) -> {
                          throwable.printStackTrace();

                      }
            );
            for (int j =0 ;j < 1000; j++){
                String uri ="/local/string-"+ j;
                ihb2.addAs(uri , meta6,jsonNode);
            }


            ihb2.flushAndWait();

            Assert.assertTrue(dbClient.newServerEval().xquery(query1).eval().next().getNumber().intValue() == 1000);

        }
        catch(Exception e){
            e.printStackTrace();
        }
    }

Log:



16:32:32.839 [main] DEBUG c.m.client.impl.JerseyServices - Connecting to localhost at 8000 as admin
16:32:33.535 [main] DEBUG c.m.client.impl.JerseyServices - Getting forestinfo as application/json
16:32:33.878 [main] DEBUG c.m.client.impl.JerseyServices - Posting eval
16:32:39.001 [main] DEBUG c.m.client.impl.JerseyServices - Connecting to localhost at 8000 as admin
16:32:39.251 [main] DEBUG c.m.client.impl.JerseyServices - Getting forestinfo as application/json
16:32:39.278 [main] INFO  c.m.c.d.impl.WriteBatcherImpl - (withForestConfig) Using [localhost] hosts with forests for "WriteHostBatcher"
16:32:39.283 [main] INFO  c.m.c.d.impl.WriteBatcherImpl - Adding DatabaseClient on port 8000 for host "localhost" to the rotation
16:32:39.325 [main] INFO  c.m.c.d.impl.WriteBatcherImpl - threadCount=1
16:32:39.325 [main] INFO  c.m.c.d.impl.WriteBatcherImpl - batchSize=50
16:32:39.412 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:39.413 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:39.419 [main] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:39.420 [main] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:43.731 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:43.731 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:43.733 [main] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:43.734 [main] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 19 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 43 more
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 19 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 43 more
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 19 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 43 more
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 19 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 43 more
16:32:43.746 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
16:32:43.748 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:43.748 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:43.756 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
16:32:43.758 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:43.758 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:43.767 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
16:32:43.769 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:43.770 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:43.783 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
16:32:43.785 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:43.786 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:44.198 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:44.199 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:44.753 [main] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.marklogic.client.FailedRequestException: Local message: failed to apply resource at documents: Bad Request. Server Message: XDMP-DOCSTARTTAGCHAR: xdmp:get-request-part-body("xml") -- Unexpected character "&quot;" in start tag at  line 1
com.marklogic.client.FailedRequestException: Local message: failed to apply resource at documents: Bad Request. Server Message: XDMP-DOCSTARTTAGCHAR: xdmp:get-request-part-body("xml") -- Unexpected character "&quot;" in start tag at  line 1
    at com.marklogic.client.impl.JerseyServices.checkStatus(JerseyServices.java:4243)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3257)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor$CallerRunsPolicy.rejectedExecution(ThreadPoolExecutor.java:2022)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1369)
    at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl.add(WriteBatcherImpl.java:292)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl.addAs(WriteBatcherImpl.java:320)
    at com.marklogic.client.datamovement.functionaltests.WriteHostBatcherTest.addWithMetadata(WriteHostBatcherTest.java:2458)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
16:32:44.756 [main] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:44.756 [main] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:44.768 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:44.768 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
com.marklogic.client.FailedRequestException: Local message: failed to apply resource at documents: Bad Request. Server Message: XDMP-DOCSTARTTAGCHAR: xdmp:get-request-part-body("xml") -- Unexpected character "&quot;" in start tag at  line 1
16:32:44.852 [main] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.marklogic.client.FailedRequestException: Local message: failed to apply resource at documents: Bad Request. Server Message: XDMP-DOCSTARTTAGCHAR: xdmp:get-request-part-body("xml") -- Unexpected character "&quot;" in start tag at  line 1
16:32:44.854 [main] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:44.854 [main] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
    at com.marklogic.client.impl.JerseyServices.checkStatus(JerseyServices.java:4243)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3257)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor$CallerRunsPolicy.rejectedExecution(ThreadPoolExecutor.java:2022)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1369)
    at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl.add(WriteBatcherImpl.java:292)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl.addAs(WriteBatcherImpl.java:320)
    at com.marklogic.client.datamovement.functionaltests.WriteHostBatcherTest.addWithMetadata(WriteHostBatcherTest.java:2458)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
com.marklogic.client.FailedRequestException: Local message: failed to apply resource at documents: Bad Request. Server Message: XDMP-DOCSTARTTAGCHAR: xdmp:get-request-part-body("xml") -- Unexpected character "&quot;" in start tag at  line 1
    at com.marklogic.client.impl.JerseyServices.checkStatus(JerseyServices.java:4243)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3257)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16:32:44.859 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.marklogic.client.FailedRequestException: Local message: failed to apply resource at documents: Bad Request. Server Message: XDMP-DOCSTARTTAGCHAR: xdmp:get-request-part-body("xml") -- Unexpected character "&quot;" in start tag at  line 1
16:32:44.860 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:44.860 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 19 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 43 more
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor$CallerRunsPolicy.rejectedExecution(ThreadPoolExecutor.java:2022)
    at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823)
    at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1369)
    at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl.add(WriteBatcherImpl.java:292)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl.addAs(WriteBatcherImpl.java:320)
    at com.marklogic.client.datamovement.functionaltests.WriteHostBatcherTest.addWithMetadata(WriteHostBatcherTest.java:2458)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 50 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 74 more
16:32:44.887 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
16:32:44.888 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:44.888 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:44.899 [main] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
16:32:44.900 [main] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:44.901 [main] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:44.936 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 19 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 43 more
16:32:44.937 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:44.937 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:45.127 [main] INFO  c.m.c.d.impl.WriteBatcherImpl - flushing 0 queued docs
16:32:45.129 [pool-3-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:45.130 [pool-3-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:45.160 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:45.160 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:539)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl.lambda$5(WriteBatcherImpl.java:658)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 22 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 46 more
16:32:45.169 [pool-3-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
16:32:45.170 [pool-3-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:45.170 [pool-3-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to output non-whitespace characters outside main element tree (in prolog or epilog)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to output non-whitespace characters outside main element tree (in prolog or epilog)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 19 more
Caused by: javax.xml.stream.XMLStreamException: Trying to output non-whitespace characters outside main element tree (in prolog or epilog)
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.BaseStreamWriter.reportNwfStructure(BaseStreamWriter.java:1593)
    at com.ctc.wstx.sw.BaseStreamWriter.writeCharacters(BaseStreamWriter.java:420)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:902)
    ... 43 more
com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
    at com.marklogic.client.impl.DigestChallengeFilter.handle(DigestChallengeFilter.java:34)
    at com.sun.jersey.api.client.filter.HTTPDigestAuthFilter.handle(HTTPDigestAuthFilter.java:493)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:568)
    at com.marklogic.client.impl.JerseyServices.doPost(JerseyServices.java:3940)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3227)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:904)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:270)
    at com.marklogic.client.impl.ValueConverter.convertFromJava(ValueConverter.java:87)
    at com.marklogic.client.io.DocumentMetadataHandle.sendPropertiesImpl(DocumentMetadataHandle.java:873)
    at com.marklogic.client.io.DocumentMetadataHandle.sendMetadataImpl(DocumentMetadataHandle.java:776)
    at com.marklogic.client.io.DocumentMetadataHandle.write(DocumentMetadataHandle.java:559)
    at com.marklogic.client.impl.StreamingOutputImpl.write(StreamingOutputImpl.java:48)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:71)
    at com.sun.jersey.core.impl.provider.entity.StreamingOutputProvider.writeTo(StreamingOutputProvider.java:57)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:218)
    at com.sun.jersey.multipart.impl.MultiPartWriter.writeTo(MultiPartWriter.java:71)
    at com.sun.jersey.api.client.RequestWriter$RequestEntityWriterImpl.writeRequestEntity(RequestWriter.java:231)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler$2.writeTo(ApacheHttpClient4Handler.java:262)
    at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
    at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
    at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
    at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
    at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
    at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
    at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
    at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:776)
    at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
    ... 19 more
Caused by: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
    at com.ctc.wstx.sw.BaseStreamWriter.throwOutputError(BaseStreamWriter.java:1564)
    at com.ctc.wstx.sw.RepairingNsStreamWriter.writeAttribute(RepairingNsStreamWriter.java:124)
    at com.marklogic.client.io.DocumentMetadataHandle$ValueSerializer.process(DocumentMetadataHandle.java:900)
    ... 43 more
16:32:45.178 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to output non-whitespace characters outside main element tree (in prolog or epilog)
16:32:45.179 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Posting documents
16:32:45.180 [pool-1-thread-1] DEBUG c.m.client.impl.JerseyServices - Sending multipart for /v1/documents
16:32:45.183 [pool-1-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.sun.jersey.api.client.ClientHandlerException: com.marklogic.client.MarkLogicIOException: javax.xml.stream.XMLStreamException: Trying to write an attribute when there is no open start element.
com.marklogic.client.FailedRequestException: Local message: failed to apply resource at documents: Bad Request. Server Message: XDMP-DOCSTARTTAGCHAR: xdmp:get-request-part-body("xml") -- Unexpected character "&quot;" in start tag at  line 1
    at com.marklogic.client.impl.JerseyServices.checkStatus(JerseyServices.java:4243)
    at com.marklogic.client.impl.JerseyServices.postResource(JerseyServices.java:3257)
    at com.marklogic.client.impl.JerseyServices.postBulkDocuments(JerseyServices.java:3345)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:619)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:611)
    at com.marklogic.client.impl.GenericDocumentImpl.write(GenericDocumentImpl.java:1)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl$BatchWriter.run(WriteBatcherImpl.java:1032)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at com.marklogic.client.datamovement.impl.WriteBatcherImpl.lambda$5(WriteBatcherImpl.java:658)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16:32:45.223 [pool-3-thread-1] WARN  c.m.c.d.impl.WriteBatcherImpl - Error writing batch: com.marklogic.client.FailedRequestException: Local message: failed to apply resource at documents: Bad Request. Server Message: XDMP-DOCSTARTTAGCHAR: xdmp:get-request-part-body("xml") -- Unexpected character "&quot;" in start tag at  line 1
16:32:45.224 [main] DEBUG c.m.client.impl.JerseyServices - Posting eval
S is true
WriteHostBatcher-1
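
From the stack traces above, the failures surface while DocumentMetadataHandle serializes document properties for batches submitted through WriteBatcher.addAs (WriteHostBatcherTest.addWithMetadata). For reference, a minimal sketch of that call shape is below; the URI, property name, and content are hypothetical, not taken from the test, and dmManager is assumed to be an existing DataMovementManager.

```java
// Hedged sketch only: mirrors the WriteBatcherImpl.addAs ->
// DocumentMetadataHandle.sendPropertiesImpl path seen in the traces.
// Uses javax.xml.namespace.QName, com.marklogic.client.datamovement.*,
// and com.marklogic.client.io.* (DocumentMetadataHandle, StringHandle, Format).
void addWithMetadataSketch(DataMovementManager dmManager) {
    DocumentMetadataHandle meta = new DocumentMetadataHandle();
    meta.getProperties().put(new QName("http://example.com/ns", "reviewed"), "true");

    WriteBatcher batcher = dmManager.newWriteBatcher().withBatchSize(25);
    dmManager.startJob(batcher);
    batcher.addAs("/sample/doc-1.xml", meta,
            new StringHandle("<root/>").withFormat(Format.XML));
    batcher.flushAndWait();
}
```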
Updated 13/02/2017 19:25 7 Comments

Thaumcraft - Golem Animation Core - Use

combak/ShatteredWorld

The Golem Animation Core "Use" should make it possible to assign multiple target blocks when selecting the golem (with the Golemancer's Bell). Here it is apparently only possible to assign a single block (even without using a core, it should be possible to assign multiple blocks).

The intention was to use this to build a tree farm where this golem plants the trees. That is rendered pointless if the golem can only plant one sapling.

Updated 17/01/2017 17:06

Text data is not properly written to MarkLogic database using Hadoop connector

marklogic/marklogic-contentpump

I am reading sample CSV data and then using the Hadoop connector to write it into a MarkLogic database as Text. The problem is that some records are written to the database a random number of times. For example, if I store 10 records, there should be 10 insertions into MarkLogic, but a few records end up written multiple times, seemingly at random, and I am not sure why.

I have shared my code below. Can anybody tell me why the writes behave so randomly with the Hadoop connector API and MarkLogic?

Here is the mapper code:

```java
public static class CSVMapper extends Mapper<LongWritable, Text, DocumentURI, Text> {
    static int i = 1;

    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        ObjectMapper mapper = new ObjectMapper();
        String line = value.toString(); // line contains one line of the csv file
        System.out.println("line value is - " + line);

        String[] singleData = line.split("\n");
        for (String lineData : singleData) {
            String[] fields = lineData.split(",");
            Sample sd = new Sample(fields[0], fields[1], fields[2].trim(), fields[3]);

            String jsonInString = mapper.writeValueAsString(sd);
            Text txt = new Text();
            txt.set(jsonInString);
            // do your processing here
            System.out.println("line Data is    - " + line);
            System.out.println("jsonInString is -  " + jsonInString);
            final DocumentURI outputURI1 = new DocumentURI("HadoopMarklogicNPPES-" + i + ".json");
            i++;

            context.write(outputURI1, txt);
        }
    }
}
```

Here is the main method:

```java
Configuration conf = new Configuration();
String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
Job job = Job.getInstance(conf, "Hadoop Marklogic MarklogicHadoopCSVDataDump");
job.setJarByClass(MarklogicHadoopCSVDataDump.class);

// Map related configuration
job.setMapperClass(CSVMapper.class);

job.setMapOutputKeyClass(DocumentURI.class);
job.setMapOutputValueClass(Text.class);
job.setOutputFormatClass(ContentOutputFormat.class);
ContentInputFormatTest.setInputPaths(job, new Path("/marklogic/sampleData.csv"));
conf = job.getConfiguration();
conf.addResource("hadoopMarklogic.xml");

try {
    System.exit(job.waitForCompletion(true) ? 0 : 1);
} catch (ClassNotFoundException | InterruptedException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
```

Here is the sample csv data:

```
"Complaint ID "," Product "," Sub-product "," Issue "
"1350210 "," Bank account or service "," Other bank product/service "," Account opening closing or management "
"1348006 "," Debt collection "," Other (phone health club etc.) "," Improper contact or sharing of info "
"1351347 "," Bank account or service "," Checking account "," Problems caused by my funds being low"
"1347916 "," Debt collection "," Payday loan "," Communication tactics"
"1348296 "," Credit card "," "," Identity theft / Fraud / Embezzlement"
"1348136 "," Money transfers "," International money transfer "," Money was not available when promised"
```
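
One thing worth checking (a hedged observation, not a confirmed diagnosis): the mapper above derives the document URI from a static counter i rather than from the record itself, so whenever a map task is re-executed (task retry or speculative execution) or the input is split across more than one mapper, the mapping between records and URIs is no longer stable, and records can end up stored more than once or overwrite each other. A sketch of deriving the URI deterministically from the map key (the byte offset of the CSV line) is below; the Jackson/Sample conversion from the original mapper is omitted for brevity.

```java
public static class CSVMapper extends Mapper<LongWritable, Text, DocumentURI, Text> {
    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        if (line.isEmpty()) {
            return;
        }
        // key.get() is the byte offset of this line within the input, which is stable
        // across task attempts, so a re-executed attempt writes to the same URI instead
        // of producing additional documents under new URIs.
        DocumentURI uri = new DocumentURI("HadoopMarklogicNPPES-" + key.get() + ".json");
        context.write(uri, value);
    }
}
```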

Updated 12/01/2017 07:49 4 Comments

Role-Based Color Scheme

bueltge/Adminimize

Ideally, the plan is that when someone is assigned a certain role on my website, their color scheme changes accordingly.

For instance, with one role, I’d like it to have, say, “Sunrise” as a theme, but for another, “Corporate”.

Updated 03/01/2017 13:42

Unable to hide wordpress logo and visit site

bueltge/Adminimize

Hello. Since updating the Adminimize plugin past 1.10.6, the "wordpress logo with full menu" and "visit site" admin-bar front-end options are still visible even after ticking them to hide. In all versions after 1.10.6, ticking/unticking these properties has no effect, and they remain visible at the top left of the WordPress site. Please can you check and advise. Thanks

Updated 30/01/2017 13:12 4 Comments

Issue1357 161

LigaData/Kamanja

I had addressed this issue in 1.5.1, but somehow the fix didn't make it into later versions. I have merged the changes into the 1.6.1 branch and done some unit testing. Let me know if I need to improve the messages further.

Updated 30/12/2016 00:28

Issue1123

LigaData/Kamanja

For Java/Scala models, the package name specified within the model code should match the nameSpace of the modelName; otherwise, the Add/Update Model operations should fail.

Updated 29/12/2016 00:51

EvalTest fails in develop branch.

marklogic/java-client-api

The following test fails in the nightly regression run against the EA4 server build. I don't see this test failing in the Jenkins environment, where the server build used is 9.0 trunk.

<testsuite tests="6" failures="1" name="com.marklogic.client.test.EvalTest" time="4.448" errors="0" skipped="0">
...
...

<testcase classname="com.marklogic.client.test.EvalTest" name="test_582_need_privilege" time="0.006">
    <failure message="a FailedRequestException should have been thrown since rest_admin doesn't have eval privileges" type="java.lang.AssertionError">java.lang.AssertionError: a FailedRequestException should have been thrown since rest_admin doesn't have eval privileges
    at org.junit.Assert.fail(Assert.java:88)
    at com.marklogic.client.test.EvalTest.test_582_need_privilege(EvalTest.java:463)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
    at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
    at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)
</failure>
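
For reference, the failing assertion corresponds to a negative-privilege check along these lines (a hedged reconstruction from the failure message, not the actual test source); evalClient is assumed to be a DatabaseClient connected as rest_admin, fail is org.junit.Assert.fail, and FailedRequestException is com.marklogic.client.FailedRequestException.

```java
// Sketch only: rest_admin lacks the eval privileges, so the server is expected to reject the call.
try {
    evalClient.newServerEval().xquery("1+1").eval();
    fail("a FailedRequestException should have been thrown since rest_admin doesn't have eval privileges");
} catch (FailedRequestException expected) {
    // expected: /v1/eval is refused for a user without the privilege
}
```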
Updated 23/01/2017 21:24 4 Comments

Nextcloud app doesn't set up HSTS

uboslinux/ubos-nextcloud

After installing Nextcloud 10 with the UBOS VirtualBox machine, Nextcloud gives the following security warning in the admin section:

The “Strict-Transport-Security” HTTP header is not configured to at least “15552000” seconds. For enhanced security we recommend enabling HSTS as described in our security tips.

This really should be set up, either out of the box or when Nextcloud is installed.
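
For reference, the warning goes away once the web server sends the header with at least the value Nextcloud checks for. A minimal sketch of the Apache directive (assuming mod_headers is enabled and the site is served over HTTPS; whether the UBOS-generated virtual host is the right place to add it is exactly what this issue is about):

```apache
# 15552000 seconds = 180 days, the minimum value Nextcloud's admin check expects
Header always set Strict-Transport-Security "max-age=15552000"
```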

Updated 26/03/2017 01:20 1 Comments

REGR: Authentication failure on creating client connection using Kerberos

marklogic/java-client-api

We are seeing 401 Unauthorized when creating a client connection using Kerberos; klist does show the principal user, and the setup seems right.

klist
Ticket cache: FILE:/tmp/krb5cc_3539
Default principal: user2@MLTEST1.LOCAL

Valid starting       Expires              Service principal
12/06/2016 15:44:41  12/07/2016 01:44:41  krbtgt/MLTEST1.LOCAL@MLTEST1.LOCAL
        renew until 12/13/2016 15:44:41

```log
<testcase name="testWriteTextDoc" classname="com.marklogic.client.functionaltest.TestDatabaseClientWithKerberos" time="0.595">
  <failure message="com.marklogic.client.FailedRequestException: Local message: write failed: Unauthorized. Server Message: Unauthorized" type="com.marklogic.client.FailedRequestException">com.marklogic.client.FailedRequestException: Local message: write failed: Unauthorized. Server Message: Unauthorized
    at com.marklogic.client.impl.JerseyServices.putPostDocumentImpl(JerseyServices.java:1413)
    at com.marklogic.client.impl.JerseyServices.putDocument(JerseyServices.java:1206)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:924)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:762)
    at com.marklogic.client.impl.DocumentManagerImpl.write(DocumentManagerImpl.java:692)
    at com.marklogic.client.functionaltest.TestDatabaseClientWithKerberos.testWriteTextDoc(TestDatabaseClientWithKerberos.java:556)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
```
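
For context, the client connection under test is created roughly like this (a hedged sketch assuming the 4.x DatabaseClientFactory.KerberosAuthContext API; the host and port are placeholders, and the ticket cache is the one shown by klist above):

```java
import com.marklogic.client.DatabaseClient;
import com.marklogic.client.DatabaseClientFactory;
import com.marklogic.client.DatabaseClientFactory.KerberosAuthContext;
import com.marklogic.client.io.StringHandle;

public class KerberosConnectSketch {
    public static void main(String[] args) {
        // Relies on the existing ticket for user2@MLTEST1.LOCAL; the target app server
        // must use kerberos-ticket authentication with an external security object
        // mapping that principal to a MarkLogic user.
        DatabaseClient client = DatabaseClientFactory.newClient(
                "example-host.mltest1.local", 8000, new KerberosAuthContext());
        client.newTextDocumentManager().write("/test/kerberos.txt",
                new StringHandle("hello")); // this write is where the 401 Unauthorized shows up
        client.release();
    }
}
```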

Updated 23/01/2017 21:45 7 Comments

Job not getting stopped when number of available hosts < 'minHosts' property

marklogic/java-client-api

This issue was observed with a specific forest configuration described below: A. This test was run on a 3-node cluster (rh7v-intel64-90-test-4/5/6.marklogic.com) with a forest (WriteBatcher-1,2,3) on each of the nodes, all associated with a db. 'WriteBatcher-1' is not configured for failover. 'WriteBatcher-3' is configured to fail over to host 'rh7v-intel64-90-test-5.marklogic.com'. 'WriteBatcher-2' is configured to fail over to host 'rh7v-intel64-90-test-4.marklogic.com'.

  1. While the 'ihb2' WriteBatcher job is executing, node rh7v-intel64-90-test-6.marklogic.com is stopped first.

21:16:24.885 [main] ERROR c.m.c.d.HostAvailabilityListener - ERROR: host unavailable "rh7v-intel64-90-test-6.marklogic.com", black-listing it for PT15S

The forest fails over to 'rh7v-intel64-90-test-5.marklogic.com', and the writing of documents to the db resumes once the failover is complete.

  2. Now 'rh7v-intel64-90-test-5.marklogic.com' is stopped. It gets blacklisted:

21:17:02.508 [main] ERROR c.m.c.d.HostAvailabilityListener - ERROR: host unavailable "rh7v-intel64-90-test-5", black-listing it for PT15S

  3. After that, the job is stopped because available hosts < minHosts:

21:17:02.772 [pool-1-thread-1] ERROR c.m.c.d.HostAvailabilityListener - Encountered [com.sun.jersey.api.client.ClientHandlerException: org.apache.http.NoHttpResponseException: The target server failed to respond] on host "rh7v-intel64-90-test-5.marklogic.com" but black-listing it would drop job below minHosts (2), so stopping job "unnamed".

  4. After that, retrying of failed batches keeps running indefinitely:

21:17:02.550 [main] WARN c.m.c.d.HostAvailabilityListener - Retrying failed batch: 132, results so far: 2640, uris: [/local/ABC-2620, /local/ABC-2621, /local/ABC-2622, /local/ABC-2623, /local/ABC-2624, /local/ABC-2625, /local/ABC-2626, /local/ABC-2627, /local/ABC-2628, /local/ABC-2629, /local/ABC-2630, /local/ABC-2631, /local/ABC-2632, /local/ABC-2633, /local/ABC-2634, /local/ABC-2635, /local/ABC-2636, /local/ABC-2637, /local/ABC-2638, /local/ABC-2639]

  5. The client process was killed after some time, and the client logs and stack trace have been attached. Client log Stack trace

Test:

@Test
public void testFailOver() throws Exception{
    try{
        final String query1 = "fn:count(fn:doc())";

        final AtomicInteger successCount = new AtomicInteger(0);

        final MutableBoolean failState = new MutableBoolean(false);
        final AtomicInteger failCount = new AtomicInteger(0);

        WriteBatcher ihb2 =  dmManager.newWriteBatcher();
        ihb2.withBatchSize(20);
        //ihb2.withThreadCount(120);


        ihb2.setBatchFailureListeners(
                  new HostAvailabilityListener(dmManager)
                    .withSuspendTimeForHostUnavailable(Duration.ofSeconds(15))
                    .withMinHosts(2)
                );  
        ihb2.onBatchSuccess(
               batch -> {

                    successCount.addAndGet(batch.getItems().length);
                    System.out.println("Success Host: "+ batch.getClient().getHost());
                    System.out.println("Success batch number: "+ batch.getJobBatchNumber());
                     System.out.println("Success Job writes so far: "+ batch.getJobWritesSoFar());
                  }
                )
                .onBatchFailure(
                  (batch, throwable) -> {
                      System.out.println("Failed batch number: "+ batch.getJobBatchNumber());
                      /*try{
                          System.out.println("Retrying batch: "+ batch.getJobBatchNumber());
                          ihb2.retry(batch);
                      }
                     catch(Exception e){
                         System.out.println("Retry of batch "+ batch.getJobBatchNumber()+ " failed");
                         e.printStackTrace();
                     }*/

                      throwable.printStackTrace();
                      failState.setTrue();
                      failCount.addAndGet(batch.getItems().length);
                  });


        dmManager.startJob(ihb2);    

        for (int j =0 ;j < 20000; j++){
            String uri ="/local/ABC-"+ j;
            ihb2.add(uri, stringHandle);
        }


        ihb2.flushAndWait();


        System.out.println("Fail : "+failCount.intValue());
        System.out.println("Success : "+successCount.intValue());
        System.out.println("Count : "+ dbClient.newServerEval().xquery(query1).eval().next().getNumber().intValue());

        Assert.assertTrue(dbClient.newServerEval().xquery(query1).eval().next().getNumber().intValue()==20000);

    }
    catch(Exception e){
        e.printStackTrace();
    }
}
Updated 20/01/2017 21:53 7 Comments

Able to register HostAvailabilityListener with 'minHosts' greater than available hosts

marklogic/java-client-api

This issue occurs with a HostAvailabilityListener registered with either QB or WB. Consider a single-node setup with a single forest associated with a db. The following configuration for HostAvailabilityListener runs fine when registered with QB as well as WB, without any exception or error being thrown.

batcher.setBatchFailureListeners(
 new HostAvailabilityListener(dmManager)
  .withSuspendTimeForHostUnavailable(Duration.ofSeconds(15))
  .withMinHosts(2)
);
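
Until the library validates this itself, the caller can guard against it explicitly. A rough sketch is below (assuming ForestConfiguration#getPreferredHosts() reflects the hosts the job can actually use; in the single-node setup above it would report one host, so withMinHosts(2) would be rejected):

```java
// Defensive check before registering the listener; 2 matches the withMinHosts(2) value above.
int availableHosts = dmManager.readForestConfig().getPreferredHosts().length;
int minHosts = 2;
if (minHosts > availableHosts) {
    throw new IllegalStateException("withMinHosts(" + minHosts + ") is greater than the "
            + availableHosts + " host(s) available to this job");
}
batcher.setBatchFailureListeners(
    new HostAvailabilityListener(dmManager)
        .withSuspendTimeForHostUnavailable(Duration.ofSeconds(15))
        .withMinHosts(minHosts));
```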
Updated 12/01/2017 23:35 1 Comments

Job hangs when available hosts < min hosts instead of terminating

marklogic/java-client-api
  1. This test was run on a 3-node cluster (rh7v-intel64-90-test-4/5/6.marklogic.com) with a forest (ApplyTransform1,2,3) on each of the nodes, associated with a db.
  2. While the 'batcher' QB job is executing, nodes rh7v-intel64-90-test-5/6.marklogic.com are stopped one after the other. The listener registered is: new HostAvailabilityListener(dmManager).withSuspendTimeForHostUnavailable(Duration.ofSeconds(15)).withMinHosts(2)
  3. From the log, it can be seen that rh7v-intel64-90-test-6.marklogic.com is blacklisted, but once rh7v-intel64-90-test-5.marklogic.com is stopped, it is not blacklisted.
  4. Instead, the process hangs forever. The stack trace and client logs are attached.

Client log jstack.txt

    @Test
    public void xQueryMasstransformReplace() throws Exception{

        WriteBatcher ihb2 =  dmManager.newWriteBatcher();
        ihb2.withBatchSize(27).withThreadCount(10);
        ihb2.onBatchSuccess(
                batch -> {


                }
                )
        .onBatchFailure(
                (batch, throwable) -> {
                    throwable.printStackTrace();
                });

        dmManager.startJob(ihb2);

        for (int j =0 ;j < 2000; j++){
            String uri ="/local/string-"+ j;
            ihb2.add(uri, meta2, stringHandle);
        }

        ihb2.flushAndWait();

        ServerTransform transform = new ServerTransform("add-attr-xquery-transform");
        transform.put("name", "Lang");
        transform.put("value", "English");

        AtomicInteger skipped = new AtomicInteger(0);
        AtomicInteger success = new AtomicInteger(0);
        AtomicInteger failure = new AtomicInteger(0);

        ApplyTransformListener listener = new ApplyTransformListener()
                .withTransform(transform)
                .withApplyResult(ApplyResult.REPLACE)
                .onSuccess(batch -> {
                    success.addAndGet(batch.getItems().length);
                }). 
                onBatchFailure((batch, throwable) -> {
                    failure.addAndGet(batch.getItems().length);
                    throwable.printStackTrace();
                }).onSkipped(batch -> {
                    skipped.addAndGet(batch.getItems().length);

                });

        QueryBatcher batcher = dmManager.newQueryBatcher(new StructuredQueryBuilder().collection("XmlTransform"))
                .onUrisReady(listener).withBatchSize(7);
        batcher.setQueryFailureListeners(
                  new HostAvailabilityListener(dmManager)
                    .withSuspendTimeForHostUnavailable(Duration.ofSeconds(15))
                    .withMinHosts(2)
                );  
        JobTicket ticket = dmManager.startJob( batcher );
        batcher.awaitCompletion();
        dmManager.stopJob(ticket);
        System.out.println("Success "+ success.intValue());
        System.out.println("Failure "+failure.intValue());
        String uris[] = new String[2000];
        for(int i =0;i<2000;i++){
            uris[i] = "/local/string-"+ i;
        }
        int count=0;
        DocumentPage page = dbClient.newDocumentManager().read(uris);
        DOMHandle dh = new DOMHandle();
        while(page.hasNext()){
            DocumentRecord rec = page.next();
            rec.getContent(dh);
            assertTrue("Element has attribure ? :",dh.get().getElementsByTagName("foo").item(0).hasAttributes());
            assertEquals("Attribute value should be English","English",dh.get().getElementsByTagName("foo").item(0).getAttributes().item(0).getNodeValue());
            count++;
        }

        assertEquals("document count", 2000,count); 
        assertEquals("document count", 2000,success.intValue()); 
        assertEquals("document count", 0,skipped.intValue()); 
    }
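
The test above waits on the untimed batcher.awaitCompletion(). Until the job terminates itself once available hosts drop below minHosts, one caller-side mitigation is to bound the wait; a sketch, assuming the timed awaitCompletion overload and an arbitrary 10-minute budget:

// Mitigation sketch (not a fix for the underlying issue): fail fast instead of hanging.
// Requires java.util.concurrent.TimeUnit in addition to the imports the test already uses.
JobTicket ticket = dmManager.startJob(batcher);
boolean finished = batcher.awaitCompletion(10, TimeUnit.MINUTES);  // assumed timed overload
if (!finished) {
    dmManager.stopJob(ticket);
    Assert.fail("QueryBatcher did not finish; available hosts likely dropped below minHosts");
}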
Updated 13/01/2017 21:37 3 Comments

Tests fail if a go-ipfs daemon is already running

ipfs/go-ipfs

Version information:

Current master: https://github.com/ipfs/go-ipfs/commit/a542dea5d

Type:

Bug

Priority:

P1

Description:

If an ipfs daemon is already running, TEST_NO_FUSE=1 make test fails like this (tested only under the same user account; not sure what happens if the daemon runs under a dedicated/separate user but still occupies the usual ports).

Updated 24/03/2017 19:46 1 Comments

HostAvailabilityListener with QueryBatcher results in incorrect behavior

marklogic/java-client-api
  1. The test used here is the same as the one referenced in #568. It queries all documents in the collection "Replace Snapshot", applies a transform to them, and finally deletes the documents, all in a single QueryBatcher instance.
  2. This test was run on a 3-node cluster (rh7v-intel64-90-test-4/5/6.marklogic.com) with a forest (ApplyTransform1,2,3) on each of the nodes, associated with a db.
  3. The forest on node rh7v-intel64-90-test-6.marklogic.com has been configured for shared-disk failover to the failover host rh7v-intel64-90-test-4.marklogic.com.
  4. While the 'batcher' QB job is executing, node rh7v-intel64-90-test-6.marklogic.com is stopped.
  5. After 30 seconds, the forest fails over to rh7v-intel64-90-test-4.marklogic.com and the db once again becomes available.
  6. But from the logs, it can be seen that the document with uri "/local/snapshot-0" has not been deleted from the db (it was also confirmed via QConsole that the document is still present).
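
For reference, the assertion that fails (see the stack trace below) boils down to checking that this document is gone. A sketch of an equivalent check, assuming a GenericDocumentManager.exists lookup rather than whatever search the real test in #568 performs:

// Post-job check sketch: the document should no longer exist once the QB job
// (transform + delete) has completed, even across the failover.
String uri = "/local/snapshot-0";
Assert.assertNull("document should have been deleted by the QueryBatcher job",
        dbClient.newDocumentManager().exists(uri));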

Client log

Test:

java.lang.AssertionError
    at org.junit.Assert.fail(Assert.java:86)
    at org.junit.Assert.assertTrue(Assert.java:41)
    at org.junit.Assert.assertFalse(Assert.java:64)
    at org.junit.Assert.assertFalse(Assert.java:74)
    at com.marklogic.client.datamovement.functionaltests.ApplyTransformTest.jsMasstransformReplaceDelete(ApplyTransformTest.java:669)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
Updated 13/01/2017 21:14

Removing the canonical class name from the document when using PojoRepository

marklogic/java-client-api

This is a question -

I was testing PojoRepository.write and noticed that the documents are created with the canonical name of the class as the root property. Is there a way I can use the simple name of the class instead?

Say for example, instead of having this

{
  "org.sanju.ml.client.odm.pojo.QuoteRequest": {
    "id": "Q-12-01-2015-123-123",
    "symbol": "AAPL",
    "quantity": 120,
    "client": {
      "org.sanju.ml.client.odm.pojo.Client": {
        "id": "C110",
        "account": {
          "org.sanju.ml.client.odm.pojo.Account": {
            "id": "A-100-1100"
          }
        }
      }
    }
  }
}

Can I have this?

{
  "id": "Q-12-01-2015-123-123",
  "symbol": "AAPL",
  "quantity": 120,
  "client": {
    "id": "C110",
    "account": {
      "Account": {
        "id": "A-100-1100"
      }
    }
  }
}

Also, how do I control the URI of the document when I use PojoRepository?
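
For what it's worth, one possible workaround (a sketch, not a PojoRepository feature) is to bypass PojoRepository and write the POJO through a JSONDocumentManager with a Jackson databind handle, which serializes the object without the canonical-name wrapper and lets you choose the URI explicitly. The QuoteRequest type, the quoteRequest instance, and its getId() accessor below are assumptions based on the JSON shown above; client is assumed to be a DatabaseClient:

// Sketch: plain Jackson serialization plus an explicit URI, instead of PojoRepository
// (which adds the canonical class name as the root key and derives the URI itself).
JSONDocumentManager docMgr = client.newJSONDocumentManager();
JacksonDatabindHandle<QuoteRequest> handle = new JacksonDatabindHandle<>(QuoteRequest.class);
handle.set(quoteRequest);                        // hypothetical POJO instance
docMgr.write("/quoteRequest/" + quoteRequest.getId() + ".json", handle);

The trade-off is that documents written this way will not round-trip through PojoRepository.read, since that API expects the class-name wrapper it writes.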

Updated 22/03/2017 17:15 4 Comments

Process hang with QueryBatcher during forest failover

marklogic/java-client-api
  1. The following test is performed on a 3-node cluster with a db associated with 3 forests.
Client log:
18:07:20.277 [main] INFO c.m.c.d.impl.WriteBatcherImpl - (withForestConfig) Using [rh7v-intel64-90-test-5.marklogic.com, rh7v-intel64-90-test-6.marklogic.com, rh7v-intel64-90-test-4.marklogic.com] hosts with forests for "ApplyTransform"
18:07:20.807 [main] INFO c.m.c.d.impl.WriteBatcherImpl - Adding DatabaseClient on port 8000 for host "rh7v-intel64-90-test-5.marklogic.com" to the rotation
18:07:20.807 [main] DEBUG c.m.client.impl.JerseyServices - Connecting to rh7v-intel64-90-test-6.marklogic.com at 8000 as admin
18:07:21.166 [main] INFO c.m.c.d.impl.WriteBatcherImpl - Adding DatabaseClient on port 8000 for host "rh7v-intel64-90-test-6.marklogic.com" to the rotation
18:07:21.166 [main] INFO c.m.c.d.impl.WriteBatcherImpl - Adding DatabaseClient on port 8000 for host "rh7v-intel64-90-test-4.marklogic.com" to the rotation
  2. While the transformation is taking place, one of the hosts, "rh7v-intel64-90-test-6.marklogic.com", is stopped.
Client log:
21:40:26.896 [pool-1-thread-1] ERROR c.m.c.d.HostAvailabilityListener - ERROR: host unavailable "rh7v-intel64-90-test-6.smarklogic.com", black-listing it for PT50S
Server error log:
2016-11-27 18:10:24.170 Info: Stopping XDQPServerConnection, client=rh7v-intel64-90-test-6.marklogic.com, conn=172.18.132.40:7999-172.18.132.42:4380
  3. Before forest failover occurs (which takes place 30 seconds after node shutdown), node "rh7v-intel64-90-test-6.marklogic.com" is restarted.
Error log:
2016-11-27 18:10:38.135 Info: Starting XDQPClientConnection, server=rh7v-intel64-90-test-6.marklogic.com, conn=172.18.132.40:38312-172.18.132.42:7999
2016-11-27 18:10:38.138 Info: Starting XDQPClientConnection, server=rh7v-intel64-90-test-6.marklogic.com, conn=172.18.132.40:38314-172.18.132.42:7999
2016-11-27 18:10:38.139 Debug: Retrying AppRequestTask::handleEvalLocked apply-transform.xqy 4333219226342543433 Update 11 because XDMP-XDQPDISC: XDQP connection disconnected, server=rh7v-intel64-90-test-6.marklogic.com
2016-11-27 18:10:38.142 Info: Starting domestic XDQPServerConnection, client=rh7v-intel64-90-test-6.marklogic.com, conn=172.18.132.40:7999-172.18.132.42:4388
2016-11-27 18:10:42.942 Info: Mounted forest ApplyTransform-3 remotely on rh7v-intel64-90-test-6.marklogic.com
2016-11-27 18:10:45.494 Info: Database ApplyTransform is online with 3 forests
  4. After 15 seconds elapse, the following message is seen in the client log:

18:10:44.378 [pool-3-thread-1] INFO  c.m.c.d.HostAvailabilityListener - it's been PT15S since host rh7v-intel64-90-test-6.marklogic.com failed, opening communication to all server hosts [[rh7v-intel64-90-test-4.marklogic.com, rh7v-intel64-90-test-5.marklogic.com, rh7v-intel64-90-test-6.marklogic.com]]
18:10:44.378 [pool-3-thread-1] INFO  c.m.c.d.impl.QueryBatcherImpl - (withForestConfig) Using [rh7v-intel64-90-test-5.marklogic.com, rh7v-intel64-90-test-6.marklogic.com, rh7v-intel64-90-test-4.marklogic.com] hosts with forests for "ApplyTransform"
  5. The server is stopped again and forest failover now takes place.
Error log:
2016-11-27 18:10:48.319 Info: Stopping XDQPServerConnection, client=rh7v-intel64-90-test-6.marklogic.com, conn=172.18.132.40:7999-172.18.132.42:4398, requests=0, recvTics=0, sendTics=1, recvs=566, sends=517, recvBytes=87352, sendBytes=204140
2016-11-27 18:11:19.971 Info: Database ApplyTransform is offline
2016-11-27 18:11:19.974 Info: Unmounted forest ApplyTransform-3 because disconnected host
2016-11-27 18:11:19.982 Notice: Failing over forest ApplyTransform-3 because the host rh7v-intel64-90-test-6.marklogic.com has gone offline
2016-11-27 18:11:19.985 Info: Forest ApplyTransform-3 state changed from unmounted to mounted
2016-11-27 18:11:20.112 Info: Forest ApplyTransform-3 state changed from mounted to recovering
2016-11-27 18:11:20.118 Info: Forest ApplyTransform-3 state changed from recovering to open

  6. The client process has been hanging since 18:11:27.207. The server was started again later.
Error log:
2016-11-27 18:11:31.532 Info: Starting domestic XDQPServerConnection, client=rh7v-intel64-90-test-6.marklogic.com, conn=172.18.132.40:7999-172.18.132.42:4400

The entire client and server log files, as well as stack traces taken at two different times, are attached: exception.txt errorlog.txt jstack.txt jstack1.txt

I am guessing a similar scenario could occur with WriteBatcher as well during failover.

Test:

@Test
    public void xQueryMasstransformReplace() throws Exception{

        WriteBatcher ihb2 =  dmManager.newWriteBatcher();
        ihb2.withBatchSize(27).withThreadCount(10);
        ihb2.onBatchSuccess(
                batch -> {


                }
                )
        .onBatchFailure(
                (batch, throwable) -> {
                    throwable.printStackTrace();
                });

        dmManager.startJob(ihb2);

        for (int j =0 ;j < 2000; j++){
            String uri ="/local/string-"+ j;
            ihb2.add(uri, meta2, stringHandle);
        }

        ihb2.flushAndWait();

        ServerTransform transform = new ServerTransform("add-attr-xquery-transform");
        transform.put("name", "Lang");
        transform.put("value", "English");

        AtomicInteger skipped = new AtomicInteger(0);
        AtomicInteger success = new AtomicInteger(0);
        AtomicInteger failure = new AtomicInteger(0);

        ApplyTransformListener listener = new ApplyTransformListener()
                .withTransform(transform)
                .withApplyResult(ApplyResult.REPLACE)
                .onSuccess(batch -> {
                    success.addAndGet(batch.getItems().length);
                }). 
                onBatchFailure((batch, throwable) -> {
                    failure.addAndGet(batch.getItems().length);
                    throwable.printStackTrace();
                }).onSkipped(batch -> {
                    skipped.addAndGet(batch.getItems().length);

                });

        QueryBatcher batcher = dmManager.newQueryBatcher(new StructuredQueryBuilder().collection("XmlTransform"))
                .onUrisReady(listener).withBatchSize(7);
        batcher.setQueryFailureListeners(
                  new HostAvailabilityListener(dmManager)
                    .withSuspendTimeForHostUnavailable(Duration.ofSeconds(15))
                    .withMinHosts(2)
                );  
        JobTicket ticket = dmManager.startJob( batcher );
        batcher.awaitCompletion();
        dmManager.stopJob(ticket);
        System.out.println("Success "+ success.intValue());
        System.out.println("Failure "+failure.intValue());
        String uris[] = new String[2000];
        for(int i =0;i<2000;i++){
            uris[i] = "/local/string-"+ i;
        }
        int count=0;
        DocumentPage page = dbClient.newDocumentManager().read(uris);
        DOMHandle dh = new DOMHandle();
        while(page.hasNext()){
            DocumentRecord rec = page.next();
            rec.getContent(dh);
            assertTrue("Element has attribure ? :",dh.get().getElementsByTagName("foo").item(0).hasAttributes());
            assertEquals("Attribute value should be English","English",dh.get().getElementsByTagName("foo").item(0).getAttributes().item(0).getNodeValue());
            count++;
        }

        assertEquals("document count", 2000,count); 
        assertEquals("document count", 2000,success.intValue()); 
        assertEquals("document count", 0,skipped.intValue()); 
    }
Updated 13/01/2017 20:51 3 Comments

Need better 404 page

uboslinux/ubos-admin

Two use cases:
  * user moves app from example.com/foo to example.com/bar. Client accesses example.com/foo.
  * user updates the device. During the update, client accesses an app that's currently unavailable.
In both cases, nice page(s) should be shown. This probably should use some kind of "match" in the Apache config.

Updated 18/03/2017 04:40

Error message for OPTIC-INVALARGS needs to be formatted correctly

marklogic/java-client-api

For a test that checks the error message when the limit value is not a positive number, the error message exposes the server-side variable name ("length") and effectively repeats the message text.

The Java test is given below:

Map<String, Object>[] literals1 = new HashMap[5];
Map<String, Object>[] literals2 = new HashMap[4];
...
...
Map<String, Object> row = new HashMap<>();          
row.put("rowId", 1); row.put("colorId", 1); row.put("desc", "ball");
literals1[0] = row; 
row = new HashMap<>();
row.put("rowId", 2); row.put("colorId", 2); row.put("desc", "square");
literals1[1] = row;
row = new HashMap<>();
row.put("rowId", 3); row.put("colorId", 1); row.put("desc", "box");
literals1[2] = row;        
row = new HashMap<>();
row.put("rowId", 4); row.put("colorId", 1); row.put("desc", "hoop");
literals1[3] = row;
row = new HashMap<>();
row.put("rowId", 5); row.put("colorId", 5); row.put("desc", "circle");
literals1[4] = row;             

row = new HashMap<>();          
row.put("colorId", 1); row.put("colorDesc", "red");
literals2[0] = row;
row = new HashMap<>();
row.put("colorId", 2); row.put("colorDesc", "blue");
literals2[1] = row;   
row = new HashMap<>();
row.put("colorId", 3); row.put("colorDesc", "black");
literals2[2] = row;        
row = new HashMap<>();
row.put("colorId", 4); row.put("colorDesc", "yellow");
literals2[3] = row;
...
// Create a new Plan.
RowManager rowMgr = client.newRowManager();
PlanBuilder p = rowMgr.newPlanBuilder();

// plans from literals
ModifyPlan plan1 = p.fromLiterals(literals1);
ModifyPlan plan2 = p.fromLiterals(literals2);
...
...
//limit with 0 length
ModifyPlan outputLimit = plan1.joinInner(plan2)
                                      .where(p.eq(p.col("colorId"), p.xs.intVal(1)))
                                      .limit(0)
                                      .select("rowId", "desc", "colorId", "colorDesc");
JacksonHandle jacksonHandle = new JacksonHandle();
jacksonHandle.setMimetype("application/json");
StringBuilder str = new StringBuilder();
try {
            rowMgr.resultDoc(outputLimit, jacksonHandle);
   }
catch(Exception ex) {
        str.append(ex.getMessage());
    }

The message is:

com.marklogic.client.FailedRequestException: Local message: failed to apply resource at rows: Bad Request. Server Message: OPTIC-INVALARGS: fn.error(null, 'OPTIC-INVALARGS', 'limit must be a positive number: '+length); -- Invalid arguments: limit must be a positive number: 0

The following text needs to be removed.

limit must be a positive number: '+length
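
For comparison, the cleaned-up message would presumably read something like the following (the exact wording is an assumption; the point is only that the raw fn.error source should not leak through):

com.marklogic.client.FailedRequestException: Local message: failed to apply resource at rows: Bad Request. Server Message: OPTIC-INVALARGS: Invalid arguments: limit must be a positive number: 0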
Updated 28/02/2017 01:45 3 Comments

Kamanja update python model enters OUTMESSAGE queue incorrectly

LigaData/Kamanja

Here are the add / update commands:

kamanja add model python $KAMANJA_HOME/input/SampleApplications/metadata/model/subtract.py MODELNAME subtract.SubtractTuple MESSAGENAME org.kamanja.arithmetic.arithmeticMsg OUTMESSAGE org.kamanja.arithmetic.arithmeticOutMsg TENANTID tenant1 MODELVERSION 0.00001
kamanja add model python $KAMANJA_HOME/input/SampleApplications/metadata/model/add.py MODELNAME add.AddTuple MESSAGENAME org.kamanja.arithmetic.arithmeticMsg OUTMESSAGE org.kamanja.arithmetic.arithmeticOutMsg TENANTID tenant1 MODELVERSION 0.00001
kamanja add model python $KAMANJA_HOME/input/SampleApplications/metadata/model/multiply.py MODELNAME multiply.MultiplyTuple MESSAGENAME org.kamanja.arithmetic.arithmeticMsg OUTMESSAGE org.kamanja.arithmetic.arithmeticOutMsg TENANTID tenant1 MODELVERSION 0.00001

kamanja update model python $KAMANJA_HOME/input/SampleApplications/metadata/model/subtract.py MODELNAME subtract.SubtractTuple MESSAGENAME org.kamanja.arithmetic.arithmeticMsg OUTMESSAGE org.kamanja.arithmetic.arithmeticOutMsg TENANTID tenant1 MODELVERSION 0.00003 MODELOPTIONS '{"InputTypeInfo": {"a" : "Float", "b" : " Float"}}'
kamanja update model python $KAMANJA_HOME/input/SampleApplications/metadata/model/add.py MODELNAME add.AddTuple MESSAGENAME org.kamanja.arithmetic.arithmeticMsg OUTMESSAGE org.kamanja.arithmetic.arithmeticOutMsg TENANTID tenant1 MODELVERSION 0.00003 MODELOPTIONS '{"InputTypeInfo": {"a" : "Float", "b" : " Float"}}'
kamanja update model python $KAMANJA_HOME/input/SampleApplications/metadata/model/multiply.py MODELNAME multiply.MultiplyTuple MESSAGENAME org.kamanja.arithmetic.arithmeticMsg OUTMESSAGE org.kamanja.arithmetic.arithmeticOutMsg TENANTID tenant1 MODELVERSION 0.00003 MODELOPTIONS '{"InputTypeInfo": {"a" : "Float", "b" : " Float"}}'

Model Result ___________________________________

DEBUG [main] - model found => add.addtuple.000000000003000000 Result: { “APIResults” : { “Status Code” : 0, “Function Name” : “GetModelDefFromCache”, “Result Data” : “{\"Model\”:{\“NameSpace\”:\“add\”,\“Name\”:\“addtuple\”,\“Version\”:\“000000000003000000\”,\“TenantId\”:\“tenant1\”,\“ElementId\”:2000004,\“Description\”:\“\”,\“Comment\”:\“\”,\“Author\”:\“\”,\“Tag\”:\“\”,\“OtherParams\”:\“{}\”,\“CreatedTime\”:1473286082267,\“UpdatedTime\”:1473286082267,\“IsReusable\”:\“true\”,\“inputMsgSets\”:[[{\“Origin\”:\“\”,\“Message\”:\“org.kamanja.arithmetic.arithmeticmsg\”,\“Attributes\”:[]}]],\“OutputMsgs\”:[\“add.addtuple_outputmsg\”],\“ModelRep\”:\“PYTHON\”,\“ModelType\”:\“PYTHON\”,\“JarName\”:\“\”,\“PhysicalName\”:\“add.AddTuple\”,\“ObjectDefinition\”:\“import abc\nfrom common.ModelInstance import ModelInstance\nimport json\nimport logging\n\nclass AddTuple(ModelInstance): \n\t\”\“\” Model AddTuple will sum msg[\“a\”] and msg[\“b\”] \“\”\“\n\tdef execute(self, msg):\n\t\t\”\“\” \n\t\tA real implementation would use the output fields to \n\t\tdetermine what should be returned. \n\t\t\“\”\“\n\t\tsumofTup = int(msg[\"a\”])\n\t\tsumofTup += int(msg[\“b\”])\n\t\toutMsg = json.dumps({‘a’ : msg[\“a\”], ‘b’ : msg[\“b\”], ‘operator’ : \“+\”, ‘result’ : sumofTup})\n\t\treturn outMsg\n\n\tdef getInputFields(self):\n\t\t\“\”\“The field names and their types needed by the model are returned to \”\“\”\n\t\t\“\”\“the python proxy (model stub communicating with this server). \”\“\”\n\t\t\“\”\“Feel free to just hard code the type info if that is best. \”\“\”\n\t\t\“\”\“The returned dictionaries are used by the python proxy to choose \”\“\”\n\t\t\“\”\“which fields from the associated messages(s) to send to the python server \”\“\”\n\t\t\“\”\“when the model is executed. This is appropriate when the message contains\”\“\”\n\t\t\“\”\“a thousand fields, but the model only uses five of them. \”\“\”\n\n\t\t\“\”\“As shown, conceivably the information could be configured in the model \”\“\”\n\t\t\“\”\“options. \”\“\”\n\n\t\tself.logger.debug(\“Entered AddTuple.getInputFields\”)\n\t\tmodelOptions = super(AddTuple, self).ModelOptions()\n\t\tinputFields = dict()\n\t\tif \“InputTypeInfo\” in modelOptions:\n\t\t\tinputFields.update(modelOptions[\“InputTypeInfo\”])\n\t\telse:\n\t\t\tinputFields[\“a\”] = \“Int\”\n\t\t\tinputFields[\“b\”] = \“Int\”\n\n\t\treturn (inputFields)\n\“,\"ObjectFormat\”:\“JSON\”,\“DependencyJars\”:[],\“Deleted\”:false,\“Active\”:true,\“TransactionId\”:13,\“DepContainers\”:[]}}“, "Result Description” : “Successfully fetched Model from Cache:add.addtuple.000000000003000000” } }

3 DEBUG [main] - model found => multiply.multiplytuple.000000000001000000 Result: { “APIResults” : { “Status Code” : 0, “Function Name” : “GetModelDefFromCache”, “Result Data” : “{\"Model\”:{\“NameSpace\”:\“multiply\”,\“Name\”:\“multiplytuple\”,\“Version\”:\“000000000001000000\”,\“TenantId\”:\“tenant1\”,\“ElementId\”:2000005,\“Description\”:\“\”,\“Comment\”:\“\”,\“Author\”:\“\”,\“Tag\”:\“\”,\“OtherParams\”:\“{}\”,\“CreatedTime\”:1473275968521,\“UpdatedTime\”:1473275968521,\“IsReusable\”:\“true\”,\“inputMsgSets\”:[[{\“Origin\”:\“\”,\“Message\”:\“org.kamanja.arithmetic.arithmeticmsg\”,\“Attributes\”:[]}]],\“OutputMsgs\”:[\“org.kamanja.arithmetic.arithmeticOutMsg\”],\“ModelRep\”:\“PYTHON\”,\“ModelType\”:\“PYTHON\”,\“JarName\”:\“\”,\“PhysicalName\”:\“multiply.MultiplyTuple\”,\“ObjectDefinition\”:\“import abc\nfrom common.ModelInstance import ModelInstance\nimport json\nimport logging\n\nclass MultiplyTuple(ModelInstance): \n\t\”\“\” Model MultiplyTuple will multiply msg[\“a\”] and msg[\“b\”] \“\”\“\n\tdef execute(self, msg):\n\t\t\”\“\” \n\t\tA real implementation would use the output fields to \n\t\tdetermine what should be returned. \n\t\t\“\”\“\n\t\tprodofTups = int(msg[\"a\”])\n\t\tprodofTups = int(msg[\“b\”])\n\t\toutMsg = json.dumps({‘a’ : msg[\“a\”], ‘b’ : msg[\“b\”], ‘operator’ : ‘’, ‘result’ : prodofTups})\n\t\treturn outMsg\n\n def getInputFields(self):\n\t\t\“\”\“The field names and their types needed by the model are returned to \”\“\”\n\t\t\“\”\“the python proxy (model stub communicating with this server). \”\“\”\n\t\t\“\”\“Feel free to just hard code the type info if that is best. \”\“\”\n\t\t\“\”\“The returned dictionaries are used by the python proxy to choose \”\“\”\n\t\t\“\”\“which fields from the associated messages(s) to send to the python server \”\“\”\n\t\t\“\”\“when the model is executed. This is appropriate when the message contains\”\“\”\n\t\t\“\”\“a thousand fields, but the model only uses five of them. \”\“\”\n\n\t\t\“\”\“As shown, conceivably the information could be configured in the model \”\“\”\n\t\t\“\”\“options. \”\“\”\n\n\t\tself.logger.debug(\“Entered MultiplyTuple.getInputFields\”)\n\t\tmodelOptions = super(MultiplyTuple, self).ModelOptions()\n\t\tinputFields = dict()\n\t\tif \“InputTypeInfo\” in modelOptions:\n\t\t\tinputFields.update(modelOptions[\“InputTypeInfo\”])\n\t\telse:\n\t\t\tinputFields[\“a\”] = \“Int\”\n\t\t\tinputFields[\“b\”] = \“Int\”\n\n\t return (inputFields)\n\“,\"ObjectFormat\”:\“JSON\”,\“DependencyJars\”:[],\“Deleted\”:false,\“Active\”:true,\“TransactionId\”:9,\“DepContainers\”:[]}}“, "Result Description” : “Successfully fetched Model from Cache:multiply.multiplytuple.000000000001000000” } }

Updated 07/01/2017 02:40 2 Comments

Retrieve search results in a JSON format

marklogic/java-client-api

As a developer I want to retrieve search results in a JSON format so that I can process JSON search results instead of XML.

I am applying a transform to search results from the /v1/search endpoint using a ServerTransform class. I want to store the search results in a JacksonHandle.

JacksonHandle results = queryMgr.search(querydef, new JacksonHandle());

But I’m getting an XQuery error returned. Below is the XQuery transform that transforms search results to JSON.

xquery version "1.0-ml";

module namespace transform = "http://marklogic.com/rest-api/transform/json-search";

import module namespace json = "http://marklogic.com/xdmp/json" at "/MarkLogic/json/json.xqy";

declare namespace search = "http://marklogic.com/appservices/search";

declare function transform(
        $context as map:map,
        $params as map:map,
        $content as document-node()
) as document-node()
{
    let $c := json:config("custom") ,
        $cx := map:put( $c, "whitespace", "ignore" ),
        $cx := map:put( $c, "text-value", "label" ),
        $cx := map:put( $c, "camel-case", fn:true() ),
        $cx := map:put( $c, "json-attributes", ("snippet-format", "total", "start", "page-length")),
        $cx := map:put( $c, "array-element-names", (xs:QName("search:result")))
    let $_ := map:put($context, "output-type", "text/json")
    let $json := json:transform-to-json(  $content ,$c )
    return $json
};
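
As an aside, if the end goal is just JSON search results rather than this specific transform, the Java API can usually return the search response as JSON directly by passing a Jackson handle to the query manager, with no server transform involved. A sketch under that assumption (the query criteria are made up for illustration):

// Sketch: request the raw search response as JSON instead of transforming XML on the server.
QueryManager queryMgr = client.newQueryManager();
StringQueryDefinition querydef = queryMgr.newStringDefinition();
querydef.setCriteria("example criteria");        // hypothetical criteria

JacksonHandle results = queryMgr.search(querydef, new JacksonHandle());
System.out.println(results.get().toString());    // Jackson JsonNode of the search response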
Updated 20/03/2017 22:51 5 Comments

Whitelist issues with proxy mode

benbaptist/minecraft-wrapper

The vanilla server whitelists based on online UUIDs. Proxy mode requires the server to be in offline mode. The result is that a player who is whitelisted on the vanilla server cannot connect via proxy mode.

Possible code solutions:
  1. Have wrapper handle whitelisting.
  2. Have wrapper modify the whitelist.json file with the offline UUID and allow the server to handle whitelists.

Solution 1 would be preferred, since it ensures a player will not become "unwhitelisted" due to a name change.

reported by @pingpong1109
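
Whichever solution wins, the value an offline-mode server matches against is (as far as I know) the name-based UUID of the string "OfflinePlayer:<name>", which is what a wrapper-written whitelist.json entry for solution 2 would need. A minimal sketch of that derivation, shown in Java for illustration since the server side is Java; treat the exact convention as an assumption:

import java.nio.charset.StandardCharsets;
import java.util.UUID;

public class OfflineUuid {
    // Offline-mode servers derive the UUID from the player name alone,
    // so a name change produces a different UUID (hence the preference for solution 1).
    public static UUID forName(String playerName) {
        return UUID.nameUUIDFromBytes(
                ("OfflinePlayer:" + playerName).getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        System.out.println(forName("pingpong1109"));  // example name taken from this report
    }
}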

Updated 01/01/2017 19:49 1 Comments

Stress test with develop branch on 9.0 MarkLogic server reports error on Semantics' verifyLoaded

marklogic/java-client-api

With the standard MarkLogic Java/REST Client API stress test setup and run, we are seeing the following error in the Stress test client program’s log file.

2016-05-20 14:42:59 (Thread=33) ERROR (SemanticGraphLoadTester.verifyLoaded) found 0 loaded but should have found 1 of these URIs: [/SEITYXYCULKMIPUT/2016420144250157/5/0NorthwindData.xml]

Thread 33 was started with the following:

2016-05-20 14:42:50 (Thread=33) ========
Starting test with uniqueURI = /SEITYXYCULKMIPUT/2016420144250157/5/
#
TestcaseID = semanticGraphs
ThreadID = 33
BinaryTest = false
JSONTest = false
InsertTime = false
NumberOfLoops = 5
MaxSleepTime = 1
LogOption = debug
WriteToScreen = true
LogFilename = auto
OutputFileName =
minRequestTimeLimit = 0
maxRequestTimeLimit = 0

numcreate = 1
checkInterval = 1000
loadDir = /space/HEAD/qa/testdata/semantics/
createModule = null
autoGenerate = true
multiStatement = false
batchSize = 1
rollback = false
generateQuery = null
numGenerated = 0
language = xquery

========
...
...
...

2016-05-20 14:42:50 (Thread=33) About to load 92 documents.

Test Env:

MarkLogic Server used: 9.0 nightly server
MarkLogic Java Client API used: develop (4.0-SNAPSHOT)

All threads that handle 0Northwind.xml have this error message reported on the client side. There are no errors reported in the server logs (possibly because the log level is set to notice).

Updated 15/03/2017 23:22

Ambiguities concerning node.Links

ipfs/go-ipfs

Let nd contain links with names ["_", "_", "__", "___"] each with different hashes.

> nd.AddNodeLink("_", someNode)
> nd.LinkNames
["_", "_", "__", "___", "_"]  # will only get sorted when nd.Encoded()/nd.Marshal()/nd.Unmarshal() is called.
> nd.GetNodeLink("_")
# returns only the first "_", never the rest
> nd.UpdateNodeLink("_")
# all nodes with name "_" removed, leaving one "_"
> nd.AddNodeLink("_", someNode)
> nd.LinkNames
["__", "___", "_", "_"]
> nd.RemoveNodeLink("_")
> nd.LinkNames
["__", "___"]
Updated 24/03/2017 19:42 1 Comments

Missing "read" dependency

marklogic/node-client-api

Examples/setup.js requires the npm “read” package, but this isn’t included in package.json.

$ node examples/setup.js 
module.js:338
    throw err;
          ^
Error: Cannot find module 'read'
    at Function.Module._resolveFilename (module.js:336:15)
    at Function.Module._load (module.js:278:25)
    at Module.require (module.js:365:17)
    at require (module.js:384:17)
    at Object.<anonymous> (/Users/dcassel/tmp/myproject/node_modules/marklogic/etc/test-setup-prompt.js:16:14)
    at Module._compile (module.js:460:26)
    at Object.Module._extensions..js (module.js:478:10)
    at Module.load (module.js:355:32)
    at Function.Module._load (module.js:310:12)
    at Module.require (module.js:365:17)
    at require (module.js:384:17)
    at Object.<anonymous> (/Users/dcassel/tmp/myproject/node_modules/marklogic/examples/setup.js:21:22)
    at Module._compile (module.js:460:26)
    at Object.Module._extensions..js (module.js:478:10)
    at Module.load (module.js:355:32)
    at Function.Module._load (module.js:310:12)
Updated 06/03/2017 21:40 5 Comments

Score value returned in search

marklogic/node-client-api

I’d like to see a feature whereby I could extract the score value from the search results in an easier fashion. At the moment the only way (as far as I know) to retrieve the score value for documents is to add the .withOptions({ debug: true }) method to qb.where().

Imagine a scenario where we have the following code:

var search = function(req, res) {
  var searchQuery = req.params.searchQuery;
  db.documents.query(
    qb.where(
      qb.collection('character'),
      qb.parsedFrom(searchQuery)
    ).withOptions({ debug: true })
  ).result().then(function(response) {
    console.log(response);
    res.json(response);
  }).catch(function(error) {
    console.log(error);
  });
};

Let's assume that executing the following search, 'term1 AND term2', would produce two results. The response in the code above would now have 3 objects: 1 object containing debug information about the search (including the score in the results array), and the other 2 objects containing information about the actual documents (including their content). There's no easy way to display a property of these documents along with their score values at the moment.

Probably the best way to do this - although I'm not sure if it's possible at all - would be to have an extra option for the withOptions categories, such as .withOptions({ categories: ['content', 'score'] }), which could produce:

{ uri: '/character/lukeskywalker.json',
    category: 'content',
    format: 'json',
    contentType: 'application/json',
    contentLength: '837',
    score: 123456,
    content:
     { name: 'Luke Skywalker' }
}
Updated 15/02/2017 21:47 3 Comments

an error without an error callback emits a spurious Bluebird error

marklogic/node-client-api

The spurious error is:

node_modules/bluebird/js/main/async.js:36 fn = function () { throw arg; };

The root error is still reported. In addition, the application should provide an error callback for better error reporting.

A quick search shows that the request-promise module found a configuration-based solution:

https://github.com/request/request-promise/commit/bc6080e501a406eb03ec779dd50458cde1bce7aa

But more investigation is needed.

Updated 06/03/2017 21:40

unable to work with forge 1.8 server in proxy mode

benbaptist/minecraft-wrapper

I've tried to use the latest stable Forge server for 1.8 (see the log below for the version), but I get an error on the client saying "This server requires FML/Forge to be installed…" BTW, everything works fine if I turn off proxy mode.

[23:40:15] [main/INFO] [FML]: Forge Mod Loader version 8.0.37.1334 for Minecraft 1.8 loading
[23:40:15] [main/INFO] [FML]: Java is Java HotSpot™ 64-Bit Server VM, version 1.8.0_40, running on Mac OS X:x86_64:10.10.3, installed at /Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/jre
[23:40:16] [main/WARN] [FML]: The coremod codechicken.core.launch.CodeChickenCorePlugin does not have a MCVersion annotation, it may cause issues with this version of Minecraft
[23:40:16] [main/INFO] [DepLoader]: Extracting file ./mods/CodeChickenCore-1.8-1.0.5.34-universal.jar!lib/CodeChickenLib-1.8-1.1.2.115-universal.jar
[23:40:16] [main/INFO] [DepLoader]: Extraction complete
[23:40:16] [main/WARN] [FML]: The coremod codechicken.lib.asm.CCLCorePlugin does not have a MCVersion annotation, it may cause issues with this version of Minecraft
[23:40:16] [main/WARN] [FML]: The coremod codechicken.nei.asm.NEICorePlugin does not have a MCVersion annotation, it may cause issues with this version of Minecraft
[23:40:16] [main/INFO] [LaunchWrapper]: Loading tweak class name net.minecraftforge.fml.common.launcher.FMLInjectionAndSortingTweaker
[23:40:16] [main/INFO] [LaunchWrapper]: Loading tweak class name net.minecraftforge.fml.common.launcher.FMLDeobfTweaker
[23:40:16] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.common.launcher.FMLInjectionAndSortingTweaker
[23:40:16] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.common.launcher.FMLInjectionAndSortingTweaker
[23:40:16] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.relauncher.CoreModManager$FMLPluginWrapper
[23:40:17] [main/INFO] [FML]: Found valid fingerprint for Minecraft Forge. Certificate fingerprint e3c3d50c7c986df74c645c0ac54639741c90a557
[23:40:18] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.relauncher.CoreModManager$FMLPluginWrapper
[23:40:18] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.relauncher.CoreModManager$FMLPluginWrapper
[23:40:18] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.relauncher.CoreModManager$FMLPluginWrapper
[23:40:18] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.relauncher.CoreModManager$FMLPluginWrapper
[23:40:18] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.relauncher.CoreModManager$FMLPluginWrapper
[23:40:18] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.common.launcher.FMLDeobfTweaker
[23:40:18] [main/INFO] [LaunchWrapper]: Loading tweak class name net.minecraftforge.fml.common.launcher.TerminalTweaker
[23:40:18] [main/INFO] [LaunchWrapper]: Calling tweak class net.minecraftforge.fml.common.launcher.TerminalTweaker
[23:40:18] [main/INFO] [LaunchWrapper]: Launching wrapped minecraft {net.minecraft.server.MinecraftServer}
[23:40:21] [Server thread/INFO]: Starting minecraft server version 1.8
[23:40:21] [Server thread/INFO] [MinecraftForge]: Attempting early MinecraftForge initialization
[23:40:21] [Server thread/INFO] [FML]: MinecraftForge v11.14.1.1334 Initialized
[23:40:21] [Server thread/INFO] [FML]: Replaced 204 ore recipies

Updated 28/12/2016 05:22 7 Comments

PojoLoadTester throws REST-INVALIDMIMETYPE during failover time period.

marklogic/java-client-api

2015-03-31 20:54:36 (Thread=391) ======== Starting test with uniqueURI = /KSPQKCFLYKKFZPRL/2015231205436623/4/

TestcaseID = pojoLoadTester
ThreadID = 391
BinaryTest = false
JSONTest = false
InsertTime = false
NumberOfLoops = 5
MaxSleepTime = 1
LogOption = debug
WriteToScreen = true
LogFilename = auto
OutputFileName =

2015-03-31 20:54:36 (Thread=391) About to load 2501 documents.
2015-03-31 20:56:32 (Thread=391) About to load 2501 documents.
2015-03-31 20:58:53 (Thread=391) About to load 2501 documents.

21:00:00 <====== One of the D-nodes goes down.

Here is the stack trace from thread 391.

2015-03-31 21:06:19 (Thread=391) com.marklogic.client.FailedRequestException: Local message: search failed: Unsupported Media Type. Server Message: REST-INVALIDMIMETYPE: (rest:INVALIDMIMETYPE) Content-Type must be one of: 'text/xml', 'text/json', 'application/xml', 'application/json', Received: text/plain
com.marklogic.client.FailedRequestException: Local message: search failed: Unsupported Media Type. Server Message: REST-INVALIDMIMETYPE: (rest:INVALIDMIMETYPE) Content-Type must be one of: 'text/xml', 'text/json', 'application/xml', 'application/json', Received: text/plain
    at com.marklogic.client.impl.JerseyServices$JerseySearchRequest.getResponse(JerseyServices.java:2206)
    at com.marklogic.client.impl.JerseyServices.getBulkDocumentsImpl(JerseyServices.java:869)
    at com.marklogic.client.impl.JerseyServices.getBulkDocuments(JerseyServices.java:744)
    at com.marklogic.client.impl.DocumentManagerImpl.search(DocumentManagerImpl.java:500)
    at com.marklogic.client.impl.JSONDocumentImpl.search(JSONDocumentImpl.java:26)
    at com.marklogic.client.impl.PojoRepositoryImpl.search(PojoRepositoryImpl.java:340)
    at com.marklogic.client.impl.PojoRepositoryImpl.search(PojoRepositoryImpl.java:325)
    at test.stress.JavaAPISession.searchPojos(JavaAPISession.java:526)
    at test.stress.PojoLoadTester.verifyIntervalAfterIteration(PojoLoadTester.java:75)
    at test.stress.RestLoadTester.runTest(RestLoadTester.java:283)
    at test.stress.StressTest.run(StressTest.java:94)
2015-03-31 21:06:19 (Thread=391) Test took: 702600ms

Updated 13/01/2017 21:25 2 Comments

PyInstaller fails to load Werkzeug modules

pyinstaller/pyinstaller

Original date: 2012/09/04 Original reporter: torsten DOT landschoff AND dynamore DOT de

I am trying to deploy an application that uses Werkzeug but I am unable to get PyInstaller to include the werkzeug.* modules. Look at this example:

#!py
from werkzeug.exceptions import InternalServerError
print "Hello World!"

Obviously this works fine when called directly from Python. Using pyinstaller on test.py without any options gives an executable which acts like this:

torsten@sharokan:~/pyinstaller-test$ ~/workspace/pyinstaller-2.0/pyinstaller.py test.py
torsten@sharokan:~/pyinstaller-test$ ./dist/test/test 
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/torsten/workspace/pyinstaller-2.0/PyInstaller/loader/iu.py", line 409, in importHook
    raise ImportError("No module named %s" % fqname)
ImportError: No module named werkzeug.exceptions

I created a hook to make PyInstaller include the werkzeug submodules and placed it into the directory hookspath. I enabled that hook by modifying the generated test.spec to include

#!py
a = Analysis(['test.py'],
             pathex=['/home/torsten/pyinstaller-test'],
             hiddenimports=[],
             hookspath="hookspath")

But I noticed that this does not make a difference for the build result: The file logdict2.7.3.final.0-2.log is unchanged. In fact, the original version also contains the werkzeug.exceptions module (verified with ArchiveViewer.py).

Why this module can not be loaded is beyond me. I enabled the debug output of iu.py and will attach that file. Perhaps somebody with more insight into PyInstaller internals can go and fix the loader.

The only workaround I found was to replace the __init__.py in the werkzeug package to remove the lazy-load feature, by commenting out the last lines so that new_module is not written into sys.modules["werkzeug"].

Updated 12/02/2017 11:55 5 Comments
