Farid Arshad

Large dataset workaround on browser

I am creating a function that needs to store large datasets. So far I have tested with 200k records, which comes to about 200 MB; in production the datasets will have about 5 million records and above, which will result in 1 GB or more. The function below is a simplified version with only 2 records for demo purposes, but it is similar to the function in my actual script.

function dynamicInput() {
    // Helper to base64-encode a UTF-8 string (not used in this demo)
    var base64 = function (s) { return window.btoa(unescape(encodeURIComponent(s))); };
    // Replaces "{Name}"-style placeholders with values from the context object
    var format = function (s, c) { return s.replace(/{(\w+)}/g, function (m, p) { return c[p]; }); };

    var staticString = 'NAME IS : "{Name}" : ';
    var data = "SampleData1";
    var ctx = "";
    var output = "";

    // This is dynamic -> 3 million records
    var sp_1 = "John";
    ctx = {
        Name: sp_1
    };
    output += format(staticString, ctx);

    // This is dynamic -> 3 million records
    var sp_2 = "Wick";
    ctx = {
        Name: sp_2
    };
    output += format(staticString, ctx);

    console.log(output);
}

The output variable keeps being appended to on every async API call. I have looked into IndexedDB, but before I really dive into an IndexedDB solution and explore that possibility, I was wondering whether there is any way to support data this large in JavaScript alone, in the browser, since I am restricted from storing it on a backend server.
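For context, this is roughly the direction I was considering with IndexedDB, as a sketch only (the database name "exportChunks" and the store name "chunks" are placeholders): each formatted chunk would be saved as its own record and read back in order at export time.

// Sketch only: "exportChunks" and "chunks" are placeholder names
function openChunkDb() {
    return new Promise(function (resolve, reject) {
        var req = indexedDB.open("exportChunks", 1);
        req.onupgradeneeded = function () {
            // Auto-incrementing keys preserve the insertion order of the chunks
            req.result.createObjectStore("chunks", { autoIncrement: true });
        };
        req.onsuccess = function () { resolve(req.result); };
        req.onerror = function () { reject(req.error); };
    });
}

// Store one formatted chunk as its own record
function saveChunk(db, chunk) {
    return new Promise(function (resolve, reject) {
        var tx = db.transaction("chunks", "readwrite");
        tx.objectStore("chunks").add(chunk);
        tx.oncomplete = resolve;
        tx.onerror = function () { reject(tx.error); };
    });
}

// Read every stored chunk back, in insertion order
function readAllChunks(db) {
    return new Promise(function (resolve, reject) {
        var req = db.transaction("chunks", "readonly").objectStore("chunks").getAll();
        req.onsuccess = function () { resolve(req.result); };
        req.onerror = function () { reject(req.error); };
    });
}

One thing I noticed while sketching this: getAll() still pulls every chunk into memory at the end, so a cursor would probably be needed for the biggest exports.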

The use case is: websiteA.com is created by team A. I am creating functionality that uses websiteA's API to retrieve that data on websiteB and then export it as XML content (websiteB is created by team B, and I don't have access to its server). WebsiteB has a feature that allows us to create custom scripts and append those scripts to their web pages.
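For illustration, here is a sketch of how I imagine the export itself could work without ever building one giant string (fetchPage() and the name field on each record are hypothetical stand-ins for websiteA's paginated API): each formatted chunk is kept in an array, and a Blob is built from the array of parts only at download time.

async function exportAsXmlFile() {
    var format = function (s, c) { return s.replace(/{(\w+)}/g, function (m, p) { return c[p]; }); };
    var staticString = 'NAME IS : "{Name}" : ';

    var parts = [];                                  // many small strings instead of one big one
    var page;
    while ((page = await fetchPage()) !== null) {    // fetchPage() is hypothetical
        var chunk = "";
        for (var i = 0; i < page.length; i++) {
            chunk += format(staticString, { Name: page[i].name });
        }
        parts.push(chunk);
    }

    // Blob accepts an array of parts, so the fully concatenated string never has to exist
    var blob = new Blob(parts, { type: "application/xml" });
    var url = URL.createObjectURL(blob);
    var a = document.createElement("a");
    a.href = url;
    a.download = "export.xml";
    a.click();
    URL.revokeObjectURL(url);
}

This still keeps all the parts in memory until the Blob is created, which is the part I am not sure will hold up at 1 GB.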

Tags: javascript, arrays, indexeddb
