Java – How do I read a large Base64 file (150MB) on an Android application?

I’m trying to read a large Base64 text file (~150MB) in an Android app.

The file contains a Base64-encoded JSON string that I need to decode, parse into a JSONObject, and use in my application. The problem is that I get an OutOfMemoryError when I try to read this data.

The app needs to work offline, so the full data set has to be downloaded to the device.

The code is as follows:

    String localPath = getApplicationContext().getFilesDir().getPath();
    String key = "dataFile.txt";

    StringBuilder text = new StringBuilder();
    File file = new File(localPath + "/" + key);
    JSONObject obj = null;

    byte[] fileContent = new byte[3000];

    try (FileInputStream fin = new FileInputStream(file)) {
        while (fin.read(fileContent) >= 0) {
            byte[] data = Base64.decode(fileContent, Base64.DEFAULT);
            try {
                text.append(new String(data, "UTF-8"));
            } catch (UnsupportedEncodingException e) {
                e.printStackTrace();
            }
        }
        obj = new JSONObject(text.toString());
    } catch (Exception e) {
        e.printStackTrace();
    }

How can I read a file like this without running out of memory?

Solution

You are reading the entire file into the text object: you iterate over the file and append every chunk to text, and only in the last step do you build the JSONObject that your application actually needs.

By the time your code reaches the obj = new JSONObject(text.toString()); line, the complete file is already held in memory inside the text object, at roughly the size of the input file, and then a JSONObject is built from it on top of that.

You can take the following steps to avoid this issue:

  1. Use a BufferedReader to read the file (optional, but plain read() calls can be slow, and buffering helps).
  2. Iterate over the file and append lines to the text object in batches of, say, 1000 or 10000 lines.
  3. Build a JSONObject from text and merge it into obj.
  4. Clear the text object before processing the next batch, then repeat until the file is exhausted.

By doing so, you keep only a small portion of the file in memory at any time; the text object acts as a buffer and consumes only a small, bounded amount of memory.

Here is a sample code fragment:

    // text and obj are the StringBuilder and JSONObject declared earlier;
    // path is the path of the data file.
    int counter = 0;
    String temp = null;
    final int BATCH_SIZE = 1000;

    try (BufferedReader br = new BufferedReader(new FileReader(path))) {

        while ((temp = br.readLine()) != null) {
            text.append(temp);
            ++counter;

            /* Process in batches */
            if (counter % BATCH_SIZE == 0) {
                /* Prepare & append JSON objects */
                obj = prepareAppendJSON(text.toString(), obj);
                /* Clear text for the next batch */
                text.setLength(0);
            }
        }

        /* Last (possibly partial) batch */
        obj = prepareAppendJSON(text.toString(), obj);
        text.setLength(0);

    } catch (IOException | JSONException ex) { // parsing inside the helper can throw JSONException
        ex.printStackTrace();
    }
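
The snippet assumes a prepareAppendJSON helper that parses one batch of text and merges it into the accumulated object, but it is never defined. Below is a minimal sketch of what such a helper might look like, under a strong assumption: every batch must itself be a complete, parseable JSON object, which is generally not true for an arbitrary 1000-line slice of one large JSON document. Treat it as an illustration of the merge step, not a drop-in implementation.

    import java.util.Iterator;

    import org.json.JSONException;
    import org.json.JSONObject;

    /* Hypothetical helper: parses one batch and merges its top-level keys
       into the JSONObject accumulated so far. Only works if each batch is
       a complete JSON object on its own. */
    private JSONObject prepareAppendJSON(String batch, JSONObject target) throws JSONException {
        JSONObject parsed = new JSONObject(batch);
        if (target == null) {
            return parsed; // first batch becomes the accumulator
        }
        Iterator<String> keys = parsed.keys();
        while (keys.hasNext()) {
            String key = keys.next();
            target.put(key, parsed.get(key));
        }
        return target;
    }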

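
One caveat worth flagging: the batched code reads the Base64 text as-is, so each appended line is still encoded, and slicing a single large JSON document into line batches rarely produces fragments that parse on their own. If the decoded payload is one big JSON document, a streaming approach sidesteps both problems. The sketch below is an alternative to the batching answer, not part of it: it decodes on the fly with android.util.Base64InputStream and walks the document with android.util.JsonReader, so memory use stays small and bounded regardless of file size. The "records" field name and the handleRecord structure are hypothetical placeholders for whatever shape your JSON actually has.

    import android.util.Base64;
    import android.util.Base64InputStream;
    import android.util.JsonReader;

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;

    /* Streams the file through a Base64 decoder and a pull parser, so only
       a small, constant slice of the document is in memory at any time. */
    void readLargeBase64Json(File file) throws IOException {
        try (FileInputStream fin = new FileInputStream(file);
             Base64InputStream b64 = new Base64InputStream(fin, Base64.DEFAULT);
             JsonReader reader = new JsonReader(
                     new InputStreamReader(b64, StandardCharsets.UTF_8))) {

            reader.beginObject();
            while (reader.hasNext()) {
                if (reader.nextName().equals("records")) { // hypothetical field
                    reader.beginArray();
                    while (reader.hasNext()) {
                        handleRecord(reader); // one record at a time
                    }
                    reader.endArray();
                } else {
                    reader.skipValue();
                }
            }
            reader.endObject();
        }
    }

    /* Hypothetical per-record handler: consume one object and, for example,
       write it to a local database instead of keeping it all in memory. */
    private void handleRecord(JsonReader reader) throws IOException {
        reader.beginObject();
        while (reader.hasNext()) {
            String field = reader.nextName();
            String value = reader.nextString(); // assumes string values
            // use the field/value pair here
        }
        reader.endObject();
    }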