
Loading large data set.

Yes lots of questions... :D I'm preparing for a demo.

I have a SQL table with 3,000,000 records and I would like to load it into a partitioned space.

What is the best way of doing so? I have a couple of machines ranging from 2-4 GB of ram. How many machines would I need?
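A common approach here is to page through the table over JDBC and write objects to the space in batches rather than one at a time. The sketch below shows only the batching logic; the `Row` record and the `loadBatch` callback are hypothetical stand-ins (in a real run the callback would invoke a space batch write such as `GigaSpace.writeMultiple`, and the rows would come from a JDBC result set rather than a generated list):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class BatchLoader {
    // Hypothetical record type standing in for the real ~550-byte POJO.
    record Row(long id, String payload) {}

    // Splits the source rows into fixed-size batches and hands each batch to
    // loadBatch. In a real loader this callback would call
    // gigaSpace.writeMultiple(batch.toArray()). Returns the number of batches.
    static int load(List<Row> rows, int batchSize, Consumer<List<Row>> loadBatch) {
        int batches = 0;
        for (int i = 0; i < rows.size(); i += batchSize) {
            loadBatch.accept(rows.subList(i, Math.min(i + batchSize, rows.size())));
            batches++;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Row> rows = new ArrayList<>();
        for (long id = 0; id < 10_000; id++) rows.add(new Row(id, "x"));
        int batches = load(rows, 1_000, batch -> { /* writeMultiple here */ });
        System.out.println(batches + " batches written");
    }
}
```

Batching matters because a per-object remote write of 3M rows pays 3M network round trips; batches of ~1,000 cut that to a few thousand calls. The batch size of 1,000 is an assumption to tune, not a recommendation from this thread.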

Each row is about 550 bytes, so if the calculation is as straightforward as...

550 bytes × 3 million rows ≈ 1.65 GB, would one or two of my machines be enough? Or is a POJO a lot smaller when serialized?
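Checking the arithmetic (the 2× backup factor and the 2× object/index overhead below are illustrative assumptions, not figures from this thread):

```java
public class SpaceSizing {
    // Raw in-memory footprint of the data set, in bytes.
    static long rawBytes(long bytesPerRow, long rowCount) {
        return bytesPerRow * rowCount;
    }

    public static void main(String[] args) {
        long raw = rawBytes(550L, 3_000_000L);   // 1,650,000,000 bytes
        double rawGb = raw / 1e9;                // ~1.65 GB

        // Assumed factors: 2x if each partition keeps a backup copy,
        // and ~2x again for Java object headers and index overhead.
        double withBackup = rawGb * 2;
        double pessimistic = withBackup * 2;

        System.out.printf("raw=%.2f GB, with backup=%.2f GB, pessimistic=%.2f GB%n",
                rawGb, withBackup, pessimistic);
    }
}
```

Even under the pessimistic estimate the data set stays in the single-digit-GB range, so the cluster size is driven less by raw data volume than by the per-JVM heap you want to allow and whether backups run on separate machines.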

{quote}This thread was imported from the previous forum. For your reference, the original is [available here|http://forum.openspaces.org/thread.jspa?threadID=3537]{quote}

asked 2010-11-08 09:28:28 -0500 by infectedrhythms

updated 2013-08-08 09:52:00 -0500 by jaissefsfex

1 Answer


answered 2010-11-08 09:46:21 -0500 by shay hassidim

