Community for yuuvis® RAD

0 votes
asked by (270 points)
edited by

Hello all,
I am trying to use the search service to export a query result for further use as a CSV file.

Rather than converting the JSON array into CSV myself, the /search/export endpoint of the search service comes in handy.

BUT:
I get the impression that /search/export is not behaving exactly the same as /search
- maybe by design, maybe I am missing something in my request?

1) If any fields are specified in the request body, the call results in an error, while specifying fields for /search works fine.

Error:
"The number of columns to be processed (1) must match the number of CellProcessors (2): check that the number of CellProcessors you have defined matches the expected number of columns being read/written",

2) The size parameter seems to be ignored by /search/export, and the default of /search (which is 100) is not applied either.
Once I tried /search/export without any fields specified, I received the data in CSV format - but with all records matching the specified filter (might that be a risk for this service?).
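For reference, a rough sketch of the request body I have in mind (the statement, field names, and size value are placeholders for illustration, not taken from a real schema - the point is that the same fields/size settings that work for /search fail or are ignored on /search/export):

```python
import json

# Hypothetical body for POST /search/export. "title" and "sysCreateDate" are
# placeholder field names; the "fields" list is what triggers the
# CellProcessors error on /search/export, and "size" is what gets ignored.
export_request = {
    "query": {
        "statement": "SELECT * FROM system:object",
        "fields": ["title", "sysCreateDate"],
        "size": 100,
    }
}

body = json.dumps(export_request, indent=2)
print(body)
```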

Hopefully I just missed some details in the request, but from the documentation I couldn't tell.

Thanks a lot

Best,
Clemens

1 Answer

0 votes
answered by (11.5k points)
edited by

Hi Clemens,

Topic 1) is a bug. Please report this to our support (internal ticket: COOL-11563).

Topic 2) In the export case we do not use the size parameter.
a) For what purpose do you need this size parameter in the case of an export?
b) Regarding security: which limit on the number of exportable rows do you have in mind?

Best Regards
Martin

commented by (270 points)
Hello Martin,
thanks for the clarification - I will wait for the bugfix on the mentioned ticket (I have asked our OS partner to keep us updated on your internal ticket).

On Topic 2)
a) I think the same rationale that led to limiting the number of records returned by /search should also apply to /search/export - or no?
In the query that prompted my question, the result set based on the given filters contained about 43,000 records. It took a while to process and deliver in CSV format, but it worked. I just don't think this is a safe approach; I would rather run this command multiple times with a size parameter to get the result in manageable chunks.
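The chunked approach I have in mind could look roughly like this (just the windowing logic - the actual paging parameter on the endpoint would depend on how /search/export ends up supporting size/offset):

```python
def export_chunks(total_hits, chunk_size):
    """Yield (offset, size) windows so a large result set can be fetched
    in manageable pieces, one request per window, instead of a single
    unbounded /search/export call."""
    offset = 0
    while offset < total_hits:
        yield offset, min(chunk_size, total_hits - offset)
        offset += chunk_size

# For the ~43,000-record result set mentioned above, chunks of 1,000:
windows = list(export_chunks(43000, 1000))
print(len(windows))  # 43 bounded requests instead of one huge export
```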

b) I don't see any limit on the number of exportable rows, and I really don't want to test it out, as this may cause issues for the service. If we just run /search/export against a system with millions of records, I am not sure that is a good idea (which points back to a).

Best,
Clemens
