large-data-volumes Questions

3

Solved

I need to write a C++ application that reads and writes large amounts of data (more than the available RAM), but always sequentially. In order to keep the data in a future-proof and easy to ...
Soares asked 23/7, 2010 at 13:55

1

Solved

My question centers on some Parallel.ForEach code that used to work without fail; now that our database has grown to five times its former size, it breaks almost regularly. Parallel.ForEach<Stock_Lis...

7

Solved

Database: SQL Server 2005. Problem: copy values from one column to another column in the same table with a billion+ rows. test_table (int id, bigint bigid). Things tried 1: update query updat...
Votive asked 22/9, 2010 at 18:49
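A common way to make a billion-row column copy survivable is to batch the UPDATE so each statement touches a bounded number of rows and the transaction log stays small. The sketch below assumes the test_table(id, bigid) layout from the question, uses SQL Server's UPDATE TOP, and keys each pass on rows not yet copied; connection setup is omitted.

```java
import java.sql.Connection;
import java.sql.Statement;

// Sketch: batched column copy for SQL Server. Each UPDATE handles at most
// batchSize rows; the WHERE clause picks up where the last batch stopped.
public class BatchedColumnCopy {
    // Pure helper: number of batches a full pass needs (testable without a DB).
    static long batchCount(long totalRows, int batchSize) {
        return (totalRows + batchSize - 1) / batchSize;  // ceiling division
    }

    static void copyColumn(Connection conn, int batchSize) throws Exception {
        String sql = "UPDATE TOP (" + batchSize + ") test_table"
                   + " SET bigid = id WHERE bigid IS NULL";
        try (Statement st = conn.createStatement()) {
            int updated;
            do {
                updated = st.executeUpdate(sql);  // one bounded batch per call
            } while (updated > 0);                // zero rows left means done
        }
    }
}
```

With a billion rows and 10,000-row batches that is 100,000 statements, but each one commits quickly and never balloons the log the way a single giant UPDATE does.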

4

Solved

In my admin section, when I edit items, I have to attach each item to a parent item. I have a list of over 24,000 parent items, which are listed alphabetically in a drop-down list (a list of music ...
Thrombus asked 3/8, 2010 at 21:26

5

Solved

I am working on a project that must store very large datasets and associated reference data. I have never come across a project that required tables quite this large. I have proved that at least on...
Hashum asked 6/4, 2010 at 23:58

8

Solved

I have large amounts of data (a few terabytes) and accumulating... They are contained in many tab-delimited flat text files (each about 30MB). Most of the task involves reading the data and aggrega...
Whiles asked 30/5, 2010 at 5:6
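For terabytes of 30 MB tab-delimited files, the usual pattern is to stream each file line by line and keep only the running aggregates in memory. The question doesn't show the real columns, so the sketch below assumes a key in column 0 and a numeric value in column 1:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: aggregate tab-delimited rows (key \t value) one line at a time,
// so memory holds only the running totals, never the raw files.
public class TabAggregator {
    static Map<String, Double> sumByKey(Iterable<String> lines) {
        Map<String, Double> totals = new HashMap<>();
        for (String line : lines) {
            String[] cols = line.split("\t");
            totals.merge(cols[0], Double.parseDouble(cols[1]), Double::sum);
        }
        return totals;
    }
}
```

In practice the Iterable would be a BufferedReader over each file in turn; because each file is consumed and discarded, peak memory tracks the number of distinct keys, not the terabytes of input.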

7

Solved

The problem is, we have a huge number of records (more than a million) to be inserted into a single table from a Java application. The records are created by the Java code; it's not a move from ano...
Noblenobleman asked 4/5, 2010 at 14:15

5

Now we have a Firebird database with 1,000,000 records that must be processed after ALL are loaded into RAM. To get all of those we must extract data using (select * first 1000 ...) for 8 hours. What ...
Reverential asked 20/4, 2010 at 17:10
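Paging with FIRST n SKIP m gets slower on every page because the server rescans all the skipped rows; keyset pagination avoids that by resuming after the last key seen, e.g. SELECT FIRST 1000 * FROM t WHERE id > ? ORDER BY id. The sketch below simulates that query shape over a sorted in-memory list (the table and column names are assumptions):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of keyset pagination: each page starts strictly after the last id
// seen, so no page rescans rows an earlier page already skipped.
public class KeysetPager {
    static List<Integer> fetchPage(List<Integer> sortedIds, int lastSeen, int pageSize) {
        List<Integer> page = new ArrayList<>();
        for (int id : sortedIds) {
            if (id > lastSeen) {           // resume point, like WHERE id > ?
                page.add(id);
                if (page.size() == pageSize) break;
            }
        }
        return page;
    }

    static int pagesNeeded(int total, int pageSize) {
        return (total + pageSize - 1) / pageSize;  // ceiling division
    }
}
```

With an index on id, each of the 1,000 pages for a million rows is an index-range read, instead of page k re-reading the first 1000*k rows.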

4

Solved

OK, so I am writing a program that unfortunately needs to use a huge data structure to complete its work, but it is failing with an "out of memory" error during initialization. While I understan...
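Besides raising the heap ceiling with -Xmx, one option when a structure simply won't fit on the heap is to back it with a memory-mapped file and let the OS page the data in and out. This sketch (the question doesn't say what the structure holds, so a flat array of longs is an assumption) maps `count` long slots onto a file:

```java
import java.io.RandomAccessFile;
import java.nio.LongBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;

// Sketch: a file-backed long[] via memory mapping. The data lives in the
// file, not on the Java heap, so it can exceed -Xmx.
public class FileBackedLongs {
    static LongBuffer map(Path file, long count) throws Exception {
        try (RandomAccessFile raf = new RandomAccessFile(file.toFile(), "rw")) {
            MappedByteBuffer buf = raf.getChannel()
                .map(FileChannel.MapMode.READ_WRITE, 0, count * Long.BYTES);
            return buf.asLongBuffer();  // element i lives at byte offset 8*i
        }
    }
}
```

Note that a single mapping is limited to 2 GB (MappedByteBuffer is int-indexed), so truly huge structures need an array of mappings.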

2

Solved

I have written a method insert() in which I am trying to use JDBC batching to insert half a million records into a MySQL database: public void insert(int nameListId, String[] names) { String sql...
Vanthe asked 9/2, 2010 at 8:5
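The usual fix for a half-million-row batch is to flush it in fixed-size chunks rather than one giant executeBatch(). The pure chunking helper below is testable on its own; the chunk size of a few thousand is an assumed tuning knob, and the JDBC part is sketched in comments because it needs a live connection:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch: split the names array into bounded chunks so each executeBatch()
// call sends a manageable number of rows.
public class InsertBatcher {
    static List<List<String>> chunks(String[] names, int chunkSize) {
        List<List<String>> out = new ArrayList<>();
        for (int i = 0; i < names.length; i += chunkSize) {
            out.add(Arrays.asList(names)
                          .subList(i, Math.min(i + chunkSize, names.length)));
        }
        return out;
    }
    // Inside insert(), each chunk then becomes:
    //   for (String name : chunk) {
    //       ps.setInt(1, nameListId); ps.setString(2, name); ps.addBatch();
    //   }
    //   ps.executeBatch();
    // With MySQL, adding rewriteBatchedStatements=true to the JDBC URL lets
    // Connector/J collapse each batch into multi-row INSERT statements,
    // which is usually the single biggest speedup here.
}
```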

3

Solved

There is a table phonenumbers with two columns: id and number. There are about half a million entries in the table. The database is MySQL. The requirement is to develop a simple Java EE application, ...
Caber asked 1/2, 2010 at 23:27
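Since the requirement is truncated, one reasonable assumption is that the application must walk all half-million rows without holding them in memory; with MySQL that is typically done by paging with LIMIT. A minimal sketch of the paging arithmetic and query shape:

```java
// Sketch: page through the phonenumbers table with MySQL's LIMIT clause.
// The page size is an assumption; ORDER BY id keeps pages stable.
public class PhonePager {
    static long offsetFor(long pageIndex, int pageSize) {
        return pageIndex * pageSize;   // first argument of LIMIT offset, count
    }

    static String pageQuery() {
        return "SELECT id, number FROM phonenumbers ORDER BY id LIMIT ?, ?";
    }
    // Alternative for a single full pass: Connector/J streams the result set
    // row by row if the statement's fetch size is set to Integer.MIN_VALUE,
    // avoiding pagination entirely.
}
```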

3

Solved

What's the best practice for storing a large amount of image data in SQL Server 2008? I'm expecting to store around 50,000 images using approx 5 gigs of storage space. Currently I'm doing this usin...
Wrench asked 30/11, 2009 at 15:8

2

Solved

I gather that there basically isn't a limit to the amount of data that can be sent when using REST via a POST or GET. While I haven't used REST or web services, it seems that most services involve t...
Few asked 30/9, 2009 at 21:24

6

Solved

I am given the task of converting a huge table to a custom XML file. I will be using Java for this job. If I simply issue a "SELECT * FROM customer", it may return a huge amount of data that eventually cau...
Domett asked 10/7, 2009 at 4:45
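The standard answer is to stream in both directions: have JDBC hand over rows incrementally (a small fetch size, or Integer.MIN_VALUE with Connector/J) and write each row straight to the output with StAX instead of building a DOM. The element names below are assumptions, since the question gives no schema; the row-to-XML step is runnable on its own:

```java
import java.io.StringWriter;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

// Sketch: serialize one row with XMLStreamWriter. In the real export the
// writer targets the output file and is created once, with one
// writeStartElement/writeEndElement pair per ResultSet row.
public class CustomerXmlWriter {
    static String rowToXml(String id, String name) throws Exception {
        StringWriter out = new StringWriter();
        XMLStreamWriter w = XMLOutputFactory.newInstance().createXMLStreamWriter(out);
        w.writeStartElement("customer");
        w.writeAttribute("id", id);
        w.writeCharacters(name);     // special characters are escaped for us
        w.writeEndElement();
        w.flush();
        return out.toString();
    }
}
```

Because neither the rows nor the document ever exist in memory at once, heap use stays flat no matter how big the customer table is.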

5

Solved

I'm working on a project which is similar in nature to website visitor analysis. It will be used by 100s of websites with an average of 10,000s to 100,000s of page views a day each, so the data amount wi...
Batman asked 2/3, 2009 at 9:39

6

Solved

For a web application I'm developing, I need to store a large number of records. Each record will consist of a primary key and a single (short-ish) string value. I expect to have about 100GB storag...
Embracery asked 9/12, 2008 at 19:58

3

Solved

I have large data sets (10 Hz data, so 864k points per 24 hours) which I need to plot in real time. The idea is that the user can zoom and pan into highly detailed scatter plots. The data is not very c...
Courland asked 3/2, 2009 at 19:57
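A screen only has a few thousand horizontal pixels, so one common approach is min/max decimation: reduce 864k samples to one min/max pair per bucket, re-decimating from the raw data on every zoom. Keeping both extremes (rather than averaging) preserves spikes. The bucket count would normally track the plot's pixel width, which is an assumption here:

```java
// Sketch: min/max decimation of a sample array into a fixed number of
// buckets; out[b] = {min, max} of bucket b.
public class MinMaxDecimator {
    static double[][] decimate(double[] samples, int buckets) {
        double[][] out = new double[buckets][2];
        int per = (samples.length + buckets - 1) / buckets;  // samples per bucket
        for (int b = 0; b < buckets; b++) {
            double lo = Double.POSITIVE_INFINITY, hi = Double.NEGATIVE_INFINITY;
            int end = Math.min((b + 1) * per, samples.length);
            for (int i = b * per; i < end; i++) {
                lo = Math.min(lo, samples[i]);
                hi = Math.max(hi, samples[i]);
            }
            out[b][0] = lo;
            out[b][1] = hi;
        }
        return out;
    }
}
```

Panning and zooming then only re-decimate the visible slice, so the plot widget never sees more points than it can draw.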

6

Solved

I have to dump a large database over a network pipe that doesn't have much bandwidth, which other people need to use concurrently. If I try it, it soaks up all the bandwidth, latency soars, and ...
Mattox asked 22/12, 2008 at 17:49

11

Solved

So I have a "large" number of "very large" ASCII files of numerical data (gigabytes altogether), and my program will need to process the entirety of it sequentially at least once. Any advice on st...
Salvatoresalvay asked 17/9, 2008 at 21:8
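For a single sequential pass over gigabytes of ASCII, the key is buffered line-at-a-time reading so memory use stays constant regardless of file size. Summing one number per line below is a stand-in assumption for whatever per-record processing the program actually does:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: stream a numeric text file line by line; only one line and the
// running total are ever in memory, however large the file is.
public class SequentialScanner {
    static double sumLines(Path file) throws IOException {
        double total = 0;
        try (BufferedReader in = Files.newBufferedReader(file)) {
            for (String line; (line = in.readLine()) != null; ) {
                total += Double.parseDouble(line.trim());
            }
        }
        return total;
    }
}
```

BufferedReader's internal buffer already amortizes the system calls; for this workload, larger buffers or NIO rarely beat the simple loop, since sequential disk throughput is usually the bottleneck.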

© 2022 - 2024 — McMap. All rights reserved.