
This is my first HTML5 application. It reads a local file, splits it into chunks, and sends them to the server; this part works perfectly.

Part of the program:

    var reader = new FileReader();

    // Read data from the chunk
    reader.onloadend = function (evt) {
      if (evt.target.readyState == FileReader.DONE) { // DONE == 2
        blob_content = evt.target.result; // The problem: blob_content != chunk
      }
    };

    chunk = blob.slice(start, end);
    reader.readAsBinaryString(chunk);
    // readAsArrayBuffer gives the same result

    xhr.open("post", "upload.php", false);
    xhr.send(blob_content); // This sends data about 30% larger
    // xhr.send(chunk);     // This sends the data correctly

I need to work with the data in a chunk (e.g. encrypt it) and do not know how. My test function generates data about 30% larger than the original.
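The ~30% inflation is characteristic of sending a binary string through `xhr.send`: the string gets re-encoded as UTF-8 for transport, so every byte in the range 0x80–0xFF becomes a two-byte sequence. A minimal self-contained demonstration of the effect (using `TextEncoder` to stand in for what the browser does when serializing the string):

```javascript
// Demonstration of why sending a "binary string" inflates the payload:
// each byte value 0x80-0xFF becomes a two-byte UTF-8 sequence when the
// string is encoded for transport.
const bytes = new Uint8Array(256);
for (let i = 0; i < 256; i++) bytes[i] = i; // one of every byte value

// What readAsBinaryString produces: one JS character per byte.
let binaryString = "";
for (const b of bytes) binaryString += String.fromCharCode(b);

// What actually goes on the wire when a string is sent: its UTF-8 encoding.
const transmitted = new TextEncoder().encode(binaryString);

console.log(bytes.length);       // 256 original bytes
console.log(transmitted.length); // 384 bytes after encoding (+50% on this data)
```

For random binary data (roughly half the bytes above 0x7F), the overhead averages out near the ~30–50% the question observes. Reading the chunk with `readAsArrayBuffer` and sending the `ArrayBuffer` avoids the re-encoding entirely.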

Maybe the problem is with the MIME type?

Thanks.

Sorry for my English.

  • Why do you use readAsBinaryString at all? For encryption (which, by the way, should be handled by the connection) or any other manipulation, you most likely want readAsArrayBuffer. Commented Apr 23, 2014 at 21:04
  • Encryption/decryption happens only on the client side; the files on the server are encrypted. readAsArrayBuffer returns the same result — what is wrong? I clearly do not understand something. Commented Apr 23, 2014 at 21:40
  • For one thing, those functions are asynchronous! You will definitely need to put xhr.send in the onloadend callback; I don't see how anything could be sent otherwise. Commented Apr 23, 2014 at 21:42
  • OK, thanks. I'm going to sleep; I'll look at it tomorrow. Commented Apr 23, 2014 at 21:45
  • Still some problems: a text file with diacritics already causes trouble, while one without diacritics seems OK. Tomorrow I will compare the resulting files in a hex editor. Commented Apr 24, 2014 at 20:42
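The fix the comments point to can be sketched as follows (a minimal sketch — the `uploadChunk` name and URL are illustrative, not from the question): FileReader works asynchronously, so the send must happen inside the `onloadend` callback, and `readAsArrayBuffer` preserves the bytes exactly.

```javascript
// Hypothetical helper: read a Blob chunk and upload it.
// FileReader is asynchronous, so xhr.send must run inside the
// onloadend callback -- the result does not exist before that.
function uploadChunk(chunk, url) {
  var reader = new FileReader();
  reader.onloadend = function (evt) {
    if (evt.target.readyState === FileReader.DONE) {
      var xhr = new XMLHttpRequest();
      xhr.open("POST", url);
      // Sending an ArrayBuffer transmits the bytes unchanged,
      // unlike a binary string, which gets re-encoded as UTF-8.
      xhr.send(evt.target.result);
    }
  };
  reader.readAsArrayBuffer(chunk); // returns immediately; result arrives later
}
```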

1 Answer


Thanks to Bergi.

I completely rewrote the code to use web workers with ArrayBuffer and ArrayBufferView, driven by events.

Main code

(Project at GitHub)

  function upload_file () {

    chunk = blob.slice(start, end);
    index++;

    var worker_reader = new Worker('worker_reader.js');
    var worker_uploader = new Worker('worker_uploader.js');

    // The reader worker has finished reading the chunk: hand it to the uploader.
    worker_reader.onmessage = function (event) {
      uload_array = {"size": blob.size, "name": blob.name, "content": event.data, "index": index};
      document.getElementById("back_message").innerHTML = "Writing " + blob.name + " " + index;
      worker_reader.terminate();
      return worker_uploader.postMessage(uload_array);
    };

    // The uploader worker has finished sending the chunk: update progress and
    // recurse into the next chunk, or merge on the server when all are done.
    worker_uploader.onmessage = function (event) {
      if (index < chunks_total) {
        start = end;
        end = start + BYTES_PER_CHUNK;
        var percentageDiv = document.getElementById("percent");
        var progressBar = document.getElementById("progressBar");

        progressBar.max = chunks_total;
        progressBar.value = index;
        percentageDiv.innerHTML = Math.round(index / chunks_total * 100) + "%";

        document.getElementById("back_message").innerHTML = "Reading " + blob.name + " " + index;
        worker_uploader.terminate();
        return upload_file();
      }
      else {
        upload_merge();
      }
    };

    document.getElementById("back_message").innerHTML = "Reading " + blob.name + " " + index;
    worker_reader.postMessage(chunk);
  }

Thanks again, Bergi.

EDIT// Added terminate-and-respawn for the workers because Firefox has a garbage collector bug; Chromium works properly without it (tested with files over 200 GB).
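As for the per-chunk processing the question asks about (e.g. encryption): once a chunk is an ArrayBuffer, a Uint8Array view gives byte-exact access, so transforms do not change the size. A minimal sketch with a placeholder XOR transform — XOR here only stands in for real encryption (WebCrypto would be used in practice) and the function name is illustrative:

```javascript
// Hypothetical per-chunk transform: XOR each byte with a repeating key.
// XOR is a placeholder for real encryption, but it shows that ArrayBuffer
// processing is byte-exact: applying the transform twice restores the
// original data, and the output size equals the input size.
function xorTransform(buffer, key) {
  const bytes = new Uint8Array(buffer);
  const out = new Uint8Array(bytes.length);
  for (let i = 0; i < bytes.length; i++) {
    out[i] = bytes[i] ^ key[i % key.length];
  }
  return out.buffer;
}

const original = new Uint8Array([1, 2, 3, 250, 251, 252]).buffer;
const key = new Uint8Array([0x5a, 0xa5]);
const encrypted = xorTransform(original, key);
const restored = xorTransform(encrypted, key);

console.log(new Uint8Array(restored).join(","));          // "1,2,3,250,251,252"
console.log(encrypted.byteLength === original.byteLength); // true -- no inflation
```

The ArrayBuffer result can be posted to the uploader worker as-is; none of the UTF-8 inflation that hit the binary-string version occurs.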
