MySQL help needed

-- WHILE ... END WHILE only runs inside a stored program, so wrap the loop in a procedure
DELIMITER //
CREATE PROCEDURE fill_randomTable()
BEGIN
    SET @i = 0;
    SET @Max = (SELECT MAX(Entry) FROM `Table`);
    WHILE @i < @Max DO
        SET @x = 0;
        WHILE @x < @i DO
            INSERT INTO randomTable (Entity) VALUES (@x);
            SET @x = @x + 1;
        END WHILE;
        SET @i = @i + 1;
    END WHILE;
END//
DELIMITER ;

CALL fill_randomTable();
 
Then it should be:

SELECT *
FROM client c
LEFT OUTER JOIN randomTable r ON c.entry = r.entry;
 

Thanks a mil @Christos, will adapt and test hopefully during this morning. First I need to migrate the MS Access DBs (which the Ops people thought were a good idea) over to a proper DB :hit:

Really appreciate all the help!
 
Don't forget to DROP/CREATE IF EXISTS first, and set the max to 10 to begin with to see if it works.
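Roughly like this, just a sketch (the single-column layout for randomTable and the fill_randomTable name are assumptions carried over from the snippet above):

-- make the test rerunnable: drop and recreate the target table each time
DROP TABLE IF EXISTS randomTable;
CREATE TABLE randomTable (Entity INT);

-- same for the procedure before recreating it
DROP PROCEDURE IF EXISTS fill_randomTable;

-- and for the first dry run, swap the MAX(Entry) lookup inside the procedure
-- for a small constant, e.g. SET @Max = 10;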
 

Thanks, I will be using a 1K snippet of the table to test, so it won't be too resource intensive (and will make sure to ORDER BY RAND() instead of Entries ASC this time :)).
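In case anyone needs it later, I'm pulling the snippet roughly like this (client_test is just a scratch name I made up):

-- copy 1,000 random rows of the client table into a scratch table for testing
DROP TABLE IF EXISTS client_test;
CREATE TABLE client_test AS
SELECT *
FROM client
ORDER BY RAND()
LIMIT 1000;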

Thanks bud
 
How did the query go, @Cespian?

Ah sorry, forgot to give feedback... I created a CRON job and it's running daily. Actually got some assistance from the IT dudes and created the insert loop in PHP (much cleaner). I also edited the original table slightly by using a unique basket/transaction_id instead of unique customer_ids, with a "processed" column, so that each day the cron job creates thousands of insert statements where "processed" IS NULL. The initial loop took just over 1H40 and Table2 grew to just over 2Bn rows. I ran a COUNT(*) yesterday afternoon and it was already on 3.6Bn. It's a mission remoting into this machine so I will check again on Friday. This campaign is running until the end of June so yeah... expecting a good couple of GBs' worth of data in there. All I said was good luck to the auditors lol. Thanks again for your help bro.
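In SQL terms the daily pass boils down to something like this (rough sketch only; the real loop is built in PHP, and Table1/Table2 and the column names here are stand-ins):

-- push the not-yet-processed baskets into the big table...
INSERT INTO Table2 (basket_id, Entity)
SELECT basket_id, Entity
FROM Table1
WHERE processed IS NULL;

-- ...then flag them so the next cron run skips them
UPDATE Table1
SET processed = 1
WHERE processed IS NULL;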
 
That's a buttload of data. My skills top out at about 50 million records; beyond that I need serious hardware with SSD drives to make the magic happen!
 

That's what dreams are made of man... a server loaded with SSDs. I have that issue when switching from my local instance (on my notebook) with SSDs to the replication machines :hit:
 