Bootstrap table export works for 5,000 rows but fails for 16,000 rows with a network failure
Below is the HTML, which loads 5,000 records. The export works perfectly fine. However, when the record count is increased to 16,000, every export fails with a network failure. No error appears in the console, and I am not sure of the reason. Tested in Chrome.

<html>

<head>
  <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css" rel="stylesheet" />
  <link href="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.css" rel="stylesheet" />

  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
  <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/extensions/export/bootstrap-table-export.min.js"></script>
</head>

<body>
  <table data-toggle="table" data-search="true" data-show-refresh="true" data-show-toggle="true" data-show-columns="true" data-show-export="true" data-minimum-count-columns="2" data-show-pagination-switch="true" data-pagination="true" data-id-field="id"
    data-page-list="[10, 25, 50, 100, ALL]" data-show-footer="false" data-side-pagination="client" data-url="https://jsonplaceholder.typicode.com/photos">
    <thead>
      <tr>
        <th data-field="id">Id</th>
        <th data-field="title">Title</th>
        <th data-field="url">URL</th>
        <th data-field="thumbnailUrl">Thumbnail URL</th>
      </tr>
    </thead>
  </table>
</body>

</html>

With more than 15,000 records:

<html>

<head>
  <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css" rel="stylesheet" />
  <link href="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.css" rel="stylesheet" />

  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
  <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/extensions/export/bootstrap-table-export.min.js"></script>
</head>

<body>
  <table data-toggle="table" data-search="true" data-show-refresh="true" data-show-toggle="true" data-show-columns="true" data-show-export="true" data-minimum-count-columns="2" data-show-pagination-switch="true" data-pagination="true" data-id-field="id"
    data-page-list="[10, 25, 50, 100, ALL]" data-show-footer="false" data-side-pagination="client" data-url="https://fd-files-production.s3.amazonaws.com/226483/16h4Vwxe1Wz9PZ5Gublomg?X-Amz-Expires=300&X-Amz-Date=20170906T130107Z&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIA2QBI5WP5HA3ZEA/20170906/us-east-1/s3/aws4_request&X-Amz-SignedHeaders=host&X-Amz-Signature=5d705bfd19579c8a93ff81ee076363b2f36d1f5e4540b85f7c86de7643c17055">
    <thead>
      <tr>
        <th data-field="id">Id</th>
        <th data-field="title">Title</th>
        <th data-field="url">URL</th>
        <th data-field="thumbnailUrl">Thumbnail URL</th>
      </tr>
    </thead>
  </table>
</body>

</html>
Doddered answered 31/8, 2017 at 16:25 Comment(16)
check this #19402138 (Intrastate)
This seems to be missing things from the question. For example, what do you mean by Bootstrap export options? What records are you talking about? I only see some HTML with some CSS and JavaScript loaded and an empty HTML table. (Jarid)
No, it has 5,000 records. You didn't notice data-url="jsonplaceholder.typicode.com/photos". When you add more JSON data, i.e. 16,000 records, the export ends with a network failure error. (Doddered)
Seems like a memory problem. If you have access to the server's settings, you may try to tweak memory and file-size limits. The other obvious options are to create your exported file in several steps (paginate the export, maybe in increments of 5,000) or use a swap file on the server. Maybe you could optimize this by exporting from the database instead of the view? (Cerise)
It is running on a local computer: static data and static HTML. I have enough memory, 8 GB. (Doddered)
@Doddered Please provide a URL fit to be placed in data-url="..." that reproduces the issue you are reporting. The current URL in data-url does not reproduce the problem. (Bamberg)
I tried this on Chrome and Firefox; it seems fine. Can't reproduce the problem. (Ostwald)
Louis, you can use any data-url; I just put one as a reference. Anything with 16,000 records will fail. (Doddered)
@Doddered It is up to you to provide the conditions that reproduce the problem, not up to readers to fill in the blanks. (Bamberg)
@Doddered The server for your > 15,000 example responds with a 403 (Forbidden) status code. (Bamberg)
@Doddered This site has a service for serving JSON files: my-json-server.typicode.com. The front page has instructions for creating a GitHub repo to serve JSON data. (Bamberg)
I am not sure how to do it; I tried several times and failed. Can you please do it? (Doddered)
However, I uploaded the file to ufile.io/2pzyd for your reference. (Doddered)
@Doddered The data URL returns a 403 Forbidden. (Ostwald)
@Doddered I think the main problem is with the tableExport plugin; the code has a lot of loops in it and is seriously not fit for a large dataset. Can you tell me what type of output you are expecting? JSON and CSV should be pretty simple. If you want, I can post that as an answer. (Sangsanger)
As Louis already pointed out, the problem is not the number of items, just a different source of data. AWS S3 simply blocks access. Provide two links to your AWS bucket, one with even 100 rows and another with 16K or whatever you want, but provide links that work rather than ones that are permission denied. (Hamford)

Try doing the following:

1.) Download the library files instead of using a CDN.

2.) Increase the page timeout on your AWS server. It's possible that there isn't enough time to process all those records.

3.) It's possible that you're hitting up against some unknown client-side restriction, like javascript.options.mem.max being 128MB. (16k records may hit that.)

4.) Try another server. There might be restrictions on AWS that you can't control (e.g. memory or "time-to-live" for your connection), but if you set up your own personal dedicated server for testing, you could rule that out.

5.) Disable your "ALL" option. Do you really want people to pull 16k records at once?

6.) As a last resort, implement server-side pagination.
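
Point 6 can be sketched in markup alone. This is a minimal example, assuming a hypothetical /api/photos endpoint on your own server; in server-side mode, bootstrap-table sends limit and offset query parameters by default and expects a JSON response of the form {"total": <row count>, "rows": [...]}:

```html
<!-- Sketch only: /api/photos is a hypothetical endpoint that must honor the
     limit/offset query parameters bootstrap-table sends when
     data-side-pagination is "server", and return {"total": ..., "rows": [...]}. -->
<table data-toggle="table"
       data-pagination="true"
       data-side-pagination="server"
       data-page-size="50"
       data-page-list="[50, 100, 500]"
       data-url="/api/photos">
  <thead>
    <tr>
      <th data-field="id">Id</th>
      <th data-field="title">Title</th>
    </tr>
  </thead>
</table>
```

With the "ALL" entry dropped from data-page-list, the browser never has to render or export the full 16,000-row dataset at once.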

Atelectasis answered 30/5, 2018 at 17:28 Comment(0)

This looks to be a problem with the S3 request expiration:

<?xml version="1.0" encoding="UTF-8"?>
<Error>
   <Code>AccessDenied</Code>
   <Message>Request has expired</Message>
   <X-Amz-Expires>300</X-Amz-Expires>
   <Expires>2017-09-06T13:06:07Z</Expires>
   <ServerTime>2018-06-02T00:00:15Z</ServerTime>
   <RequestId>396C37F87B33C933</RequestId>
   <HostId>pg4uY75WW5p07yvAtqhEFvvKi0FreyHlNo/gJ329aRYHP9/KgzkVxRVkH4lZkwPtw7bLET+HPl8=</HostId>
</Error>
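
Note that X-Amz-Expires is measured in seconds, not milliseconds. As a quick sanity check, a small sketch (using the standard URL API available in browsers and Node; the URL below is a trimmed-down stand-in for the signed URL in the question) shows how the expiry instant is derived from the query parameters:

```javascript
// Hypothetical helper: given an S3 presigned URL, compute when it expires.
// X-Amz-Date is the signing time (ISO 8601 basic format, UTC) and
// X-Amz-Expires is the validity window in seconds.
function presignedUrlExpiry(urlString) {
  const params = new URL(urlString).searchParams;
  const amzDate = params.get("X-Amz-Date");              // e.g. "20170906T130107Z"
  const expiresIn = Number(params.get("X-Amz-Expires")); // e.g. 300 seconds
  // Expand the compact timestamp into an ISO string Date can parse.
  const iso = amzDate.replace(
    /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/,
    "$1-$2-$3T$4:$5:$6Z"
  );
  return new Date(new Date(iso).getTime() + expiresIn * 1000);
}

const url = "https://example.s3.amazonaws.com/file" +
            "?X-Amz-Expires=300&X-Amz-Date=20170906T130107Z";
console.log(presignedUrlExpiry(url).toISOString());
// 2017-09-06T13:06:07.000Z — matching the <Expires> element above
```

Comparing that instant with the <ServerTime> in the error response shows the URL had been expired for months, so any new request must be re-signed.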
Omidyar answered 2/6, 2018 at 0:4 Comment(1)
That looks like it'd make sense. A 300-second expiry window will definitely be exceeded here; the ServerTime shows the signed URL had long since expired. (Atelectasis)

© 2022 - 2024 — McMap. All rights reserved.