API information for data server

Answered
vaibhav asked on April 14, 2022

Hi,
I am working on the features of the Data Server and have a few doubts regarding its usage.
Do you have API documentation for the Data Server? Basically, we want to manually call an API to refresh the server's data, add new indexes at runtime, and send custom requests with a customer filter to be used during the query.
 
Regards,
Vaibhav

8 answers

Public
Maksym Diachenko Flexmonster April 15, 2022

Hello, Vaibhav!

Thank you for writing to us.

We have a documentation section about the FDS that covers basic use cases. You can find it in the left menu of our documentation page, under "Connecting to data source/Flexmonster Data Server".

You can achieve the mentioned functionality by using Flexmonster Data Server as a DLL. Using the DLL, you can extend the base features of Flexmonster Data Server to handle additional scenarios. Please see our custom .NET server using Flexmonster.DataServer.Core.dll. This project can be a good starting point for implementing your own server. Use these commands to install and run the project:

git clone https://github.com/flexmonster/flexmonster-data-server-dll
cd flexmonster-data-server-dll
cd DemoDataServerCore
dotnet restore
dotnet run

 
Manual data refresh
The general idea is as follows: the Flexmonster Data Server DLL provides the IDataStorage interface:

public interface IDataStorage
{
   public Task<IDataStructure> GetOrAddAsync(string cacheKey);
   public void Remove(string key);
}

Using IDataStorage, you could create a custom service for reloading indexes in your application. This would allow you to refresh indexes when necessary, not just every x minutes.
Below is a code snippet showing a sample implementation of a custom index reloading service:

public interface IReloadService
{
   Task Reload(string indexName);
}

public class ReloadService : IReloadService
{
   private readonly IDataStorage _dataStorage;

   public ReloadService(IDataStorage dataStorage)
   {
      _dataStorage = dataStorage;
   }

   public async Task Reload(string indexName)
   {
      // Refresh the data index
      _dataStorage.Remove(indexName);
      await _dataStorage.GetOrAddAsync(indexName);
   }
}

 
Then you could register the service in your application, for example, in Startup.cs:

public IConfiguration Configuration { get; }

public void ConfigureServices(IServiceCollection services)
{
    // other services configurations
    services.ConfigureFlexmonsterOptions(Configuration);
    services.AddFlexmonsterApi();
    services.AddScoped<IReloadService, ReloadService>();
    // other services configurations
}
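
With the service registered, the reload can be triggered on demand through an HTTP endpoint. Below is a minimal sketch of such a controller; the controller name and route are only an assumption and can be adapted to your project:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

[Route("api/[controller]")]
[ApiController]
public class ReloadController : ControllerBase
{
   private readonly IReloadService _reloadService;

   public ReloadController(IReloadService reloadService)
   {
      _reloadService = reloadService;
   }

   // POST api/reload/{indexName} refreshes the specified index on demand
   [HttpPost("{indexName}")]
   public async Task<IActionResult> Post(string indexName)
   {
      await _reloadService.Reload(indexName);
      return Ok();
   }
}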

 
Adding indexes at runtime
You can use an approach similar to the one described above to add indexes at runtime.

public class JsonIndexService
{
   private readonly IDataStorage _dataStorage;
   private readonly IOptionsMonitor<DatasourceOptions> _datasourceOptions;

   public JsonIndexService(IOptionsMonitor<DatasourceOptions> datasourceOptions, IDataStorage dataStorage)
   {
      _datasourceOptions = datasourceOptions;
      _dataStorage = dataStorage;
   }

   // JsonIndexOptionModel is a class containing the parameters that will be passed
   // as the settings of the new index. You can implement it in any way you want.
   public async Task Create(JsonIndexOptionModel indexOptions)
   {
      var jsonIndexOptions = new JsonIndexOptions(indexOptions.Path);
      jsonIndexOptions.RefreshTime = indexOptions.RefreshTime;
      _datasourceOptions.CurrentValue.Indexes.Add(indexOptions.IndexName, jsonIndexOptions);
      await _dataStorage.GetOrAddAsync(indexOptions.IndexName);
   }
}

// Register the service, for example, in Startup.cs:
services.AddScoped<JsonIndexService>();
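
Note that JsonIndexOptionModel is not part of the DLL, so you implement it yourself. A possible shape, assumed only from the properties used in the snippet above, could be:

// Hypothetical model; only the properties used in JsonIndexService.Create are assumed here
public class JsonIndexOptionModel
{
   public string IndexName { get; set; }
   public string Path { get; set; }
   // Use the same type as JsonIndexOptions.RefreshTime from the DLL
   public int RefreshTime { get; set; }
}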

Custom request with customer filter
The Flexmonster Data Server DLL provides the ability to create custom server-side filters. Please see the following guide on how such a filter can be implemented: https://www.flexmonster.com/doc/getting-started-with-data-server-dll/.

Custom API
Aside from using the Flexmonster Data Server DLL, another option would be to create your own server implementing the custom data source API - our communication protocol that allows Flexmonster Pivot to retrieve already aggregated data from a server. Such an approach provides even more flexibility, as you can add any additional data processing in your server-side implementation.

Please let us know if the described approaches would work for your case.

Best Regards,
Maksym

Public
Maksym Diachenko Flexmonster April 26, 2022

Hello, Vaibhav!

We are curious whether you tried the provided functionality in Flexmonster Data Server DLL. If you prefer the Custom API solution, we are ready to provide you with more guidance. Our team will be glad to hear your feedback/thoughts.

Best Regards,
Maksym

Public
vaibhav May 4, 2022

Hi,
 
I am using the custom data source API for now, with the Node.js implementation provided by Flexmonster. Is there a possibility to use CSV files as the source data instead of JSON files?
 
Regards,
Vaibhav P.

Public
vaibhav May 4, 2022

Hi,
One more issue I have found: when I am using the Node.js custom data source API, loading data of more than 500 MB gives an error.
 
Regards,
Vaibhav P.

Attachments:
Untitled.png

Public
Maksym Diachenko Flexmonster May 4, 2022

Hi, Vaibhav!

Thank you for your feedback.

It is possible to use the custom data source API with any data source, including CSV. Our sample project is focused on showing the data processing logic. It uses JSON files, as they are the easiest to read from a file. However, you can modify this project further to support any data sources you need.

We suggest using the csvtojson npm library to make the sample project work with CSV without any major changes in the code. This library provides the csv().fromFile(csvFilePath) method, which allows converting a CSV file to JSON in the format used by Flexmonster. To use the CSV data source, replace the code lines that use the fs.readFile method with the following code:

const csv = require('csvtojson'); // add to the top of the file

let fileContent;
await csv({ checkType: true })
    .fromFile(`${dataFolder}/${index}.csv`)
    .then((jsonObj) => {
        fileContent = jsonObj;
    });

The checkType property is responsible for parsing numbers inside CSV.
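
Note that csvtojson is not included in the Node.js sample by default, so it should be installed first:

npm install csvtojson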

Speaking of ERR_STRING_TOO_LONG, this is a limitation of Node.js itself (https://nodejs.org/api/errors.html#err_string_too_long) and is not related to Flexmonster. Check this StackOverflow answer about the issue.

Our team has modified the sample from GitHub to illustrate a potential approach to using CSV files. You're welcome to use this implementation as a reference or adjust this approach to meet your needs.

Hope it helps. 

Best Regards,
Maksym

Attachments:
server-nodejs.zip

Public
vaibhav May 5, 2022

Hi,
Thanks for the quick support. The major problem for us is the size of the CSV or JSON file. A single file is often more than 1 or 2 GB, which always results in an out-of-memory error. I have tried the solution you provided above. Do you recommend any possible way to handle this problem? I have tested almost every connectivity option provided by Flexmonster (JSON file, CSV file, Data Server, custom data source API), and every time I get stuck due to the memory issue. Please guide us.
 
Regards,
Vaibhav P.

Public
Maksym Diachenko Flexmonster May 6, 2022

Hi, Vaibhav!

Thank you for your question.

The Flexmonster Data Server uses better-optimized solutions for reading big files. The fact that it gets stuck due to memory issues can be a sign that your server doesn't have enough RAM to work with such a big amount of data. With this in mind, we recommend allocating more RAM to the server where you are running the FDS or your custom API.

Our sample custom API project is not optimized for reading large files, since its primary purpose is to demonstrate the data processing logic and how to work with the custom data source API protocol. A possible way to optimize reading big CSV data sets is to use a read stream, for example, with Node.js's readline module:

const fs = require('fs');
const readline = require('readline');

const dataFolder = process.argv[3] || './data';
const index = 'data';

//Initialize read stream
const rl = readline.createInterface({
    input: fs.createReadStream(`${dataFolder}/${index}.csv`)
});

//Read file line by line
rl.on('line', function (line) {
    console.log(line);
    // Do any 'line' processing
});

rl.on('close', function () {
    console.log('Read stream closed');
});

Please see this article about reading big files: https://idkblogs.com/node/408/Read-Very-Large-File-7+-GB-file-in-Nodejs. It may be particularly interesting for you, as it shows how to read the TSV format, which is very similar to CSV, and it also uses the csvtojson library from our previous example.
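
To combine both suggestions, the read stream can also be fed directly into csvtojson, so each CSV row is converted to JSON without keeping the whole file in memory. Below is a rough sketch, not part of the official sample, using the same file-path convention as the snippets above:

const fs = require('fs');
const csv = require('csvtojson');

const dataFolder = process.argv[3] || './data';
const index = 'data';

// Stream the CSV file and convert it row by row instead of reading it into one string
csv({ checkType: true })
    .fromStream(fs.createReadStream(`${dataFolder}/${index}.csv`))
    .subscribe(
        (row) => {
            // Process or aggregate each parsed row here
        },
        (error) => console.error(error),
        () => console.log('CSV parsing finished')
    );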

Let us know if this approach works fine for you.

Best Regards,
Maksym

Public
Maksym Diachenko Flexmonster May 18, 2022

Hi, Vaibhav!

Hope you are doing well.
We would like to know whether you tried out the approach with ReadStream for reading big files for the custom API.
Looking forward to your response.

Best Regards,
Maksym
