View Full Version : Resuming quest to open a massive text file


Aeolwind
05-13-2013, 09:04 AM
It has been about 3 years since I last tried; I'm hoping a new program has popped up. I have a 4ish GB log file from EQ that I would like to open. Suggestions? Notepad++ chokes.

quido
05-13-2013, 09:13 AM
What are you looking to do once you have it open?

I sometimes have to open logs that are a gig at work and used to use Dreamweaver to do so since I had it open anyways. It works but is slow as hell. Most likely, though, there is some upper bound to the amount of data that can be loaded.

Getting good with unix commands like grep, tail, etc. will most likely be a big help for you here. Regular expressions ftw!
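
If you wind up doing the searching on Windows instead, the same streaming idea is only a few lines of C#. A rough, untested sketch - the path and the search pattern below are just placeholders:

// Scan the log line by line without loading the whole thing into memory,
// printing every line that matches a regular expression.
using System;
using System.IO;
using System.Text.RegularExpressions;

class LogSearch
{
    static void Main()
    {
        var pattern = new Regex(@"Rallos Zek");                        // placeholder search term
        foreach (string line in File.ReadLines(@"C:\myeqlog.txt"))     // placeholder path
        {
            if (pattern.IsMatch(line))
                Console.WriteLine(line);
        }
    }
}

File.ReadLines hands you one line at a time, so memory use stays flat no matter how big the log gets.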

Vega
05-13-2013, 09:20 AM
Could also break it up with a scripting language to smaller pieces.

Estu
05-13-2013, 09:26 AM
I use Textpad for all of my text and small coding needs, but I've never had to open a file that big so I don't know how it would react. Your best bet, as other posters have said, is writing a program to break it up into smaller pieces or to just straightaway do whatever it is you need to do with it.

Nirgon
05-13-2013, 09:41 AM
Unix less command?

How big you talkin'?

You could write something small that pipes in X-Y indexed characters at a time to another text pad?
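
Roughly like this, maybe - an untested sketch, with the path and the offsets as placeholders:

// Pull an arbitrary byte range [start, end) out of the log and dump it to a
// small file that any ordinary editor can open.
// (A boundary can land mid-line, but for eyeballing the log that's usually fine.)
using System;
using System.IO;
using System.Text;

class ChunkReader
{
    static void Main()
    {
        long start = 0;                      // X: first byte to read (placeholder)
        long end = 50L * 1024 * 1024;        // Y: stop after ~50 MB (placeholder)

        using (var fs = new FileStream(@"C:\myeqlog.txt", FileMode.Open, FileAccess.Read))
        {
            fs.Seek(start, SeekOrigin.Begin);
            var buffer = new byte[(int)(end - start)];
            int read = fs.Read(buffer, 0, buffer.Length);
            File.WriteAllText(@"C:\myeqlog_chunk.txt",
                Encoding.ASCII.GetString(buffer, 0, read));
        }
    }
}

Then just bump start and end along in whatever size your editor can stomach.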

Aeolwind
05-13-2013, 10:56 AM
What are you looking to do once you have it open?

I need to read it =D. It has just about everything I did in EQ from '99 thru about 2006 in it. Every quest, every raid mob, every face plant, every cyber session.

Textpad chokes and dies. I'd tried breaking it down before, but it corrupted a copy in the process. Tried Dreamweaver as well and it tanked. Filesplitter exploded.

Going to try Chainsaw.

Itap
05-13-2013, 11:01 AM
I need to read it =D. It has just about everything I did in EQ from '99 thru about 2006 in it. Every quest, every raid mob, every face plant, every cyber session.

http://ironwolf.dangerousgames.com/blog/wp-content/lolz/images/hurrithreadgy6.jpg

Swish
05-13-2013, 11:09 AM
Scribd? Google Docs? Might take forever to upload, but if it's online it might be easy to view.

falkun
05-13-2013, 11:11 AM
I've opened files as large as 1.5GB with Textpad. These same files crashed Notepad, Wordpad, Notepad++, and EditPlus. Have you tried gedit? Beyond that, I would consider truncating the file.
It may sound strange, but have you considered reading the file into MATLAB? My large data files were CSV, but could you try reading it into a 1 x superlong array in MATLAB? You could then manipulate the array into smaller arrays and output smaller text files.

feanan
05-13-2013, 12:10 PM
glogg

http://glogg.bonnefon.org/

Aeolwind
05-13-2013, 12:22 PM
glogg

http://glogg.bonnefon.org/

Promising.

norova
05-13-2013, 12:24 PM
Haven't had to open anything larger than 1GB with this, but try SublimeText (http://www.sublimetext.com/)

Ephi
05-13-2013, 12:40 PM
Haven't had to open anything larger than 1GB with this, but try SublimeText (http://www.sublimetext.com/)

Sublime will choke on a large file (loads it all into memory). :(

I usually just use less, but then again I'm assuming unix commands don't count.

Sadre Spinegnawer
05-13-2013, 12:58 PM
I do hope you realize some of us are masturbating to this thread. Just sayin.

Aeolwind
05-13-2013, 01:07 PM
I do hope you realize some of us are masturbating to this thread. Just sayin.

3rd bout of trying to open this thing. No one is jonesing for climax more than me I don't think. It will need to be sanitized obviously to protect the innocent.

Unix commands scare me on this file. They corrupted a copy.

August
05-13-2013, 01:42 PM
The problem with trying to load a file such as this is that most programs are going to load the entire file into memory. How much memory do you have? If it's not significantly over 4GB, you're hosed.

You could also just stop trying to find the solution and create one yourself:

void ReadAndWrite()
{
    // Stream the big log in, writing it back out in 1000-line chunks.
    System.IO.StreamReader InFile = new System.IO.StreamReader("c:\\myeqlog.txt");
    string line;

    int n = 0;
    while ((line = InFile.ReadLine()) != null)
    {
        // Each chunk gets its own numbered file: C:\MyEQLog0.txt, C:\MyEQLog1.txt, ...
        System.IO.StreamWriter OutFile = new System.IO.StreamWriter(@"C:\MyEQLog" + n + ".txt");
        int Count = 0;
        do
        {
            OutFile.WriteLine(line);
            Count++;
        } while (Count < 1000 && (line = InFile.ReadLine()) != null);
        OutFile.Close();
        n++;
    }
    InFile.Close();
}

This is purely from memory. Basically: open a stream on the file, read in 1,000 lines, write out 1,000 lines, increment your log name, rinse and repeat. This lets you choose how large the file chunks are (by changing the count). You probably also need to generate the new file name somewhat differently - I'm not sure the way I did it would compile, and I'm lazy.

-Tomtee

webrunner5
05-13-2013, 02:20 PM
What about some compression/zip file scheme? Backed up, of course.

nilbog
05-13-2013, 02:25 PM
http://www.gdgsoft.com/files/gsplits.exe - download and install; when prompted, choose to integrate into the shell.
Step 1: Make a copy of the file you're working on. Create a new folder for testing (i.e. desktop\txttest) and paste the copy of the original there.
Step 2: Load the copy using the menu on the left in GSplit (Original File option).
Step 3: Destination folder: I used the same one I put the copy in.
Step 4: Type and size: 20.00 MB.
Step 5: Filenames: you can use your own naming, but you want .txt. Under piece name mask: disk{num}.txt
Step 6: Split file!

Would try this

August
05-13-2013, 02:31 PM
http://www.gdgsoft.com/files/gsplits.exe - download and install; when prompted, choose to integrate into the shell.
Step 1: Make a copy of the file you're working on. Create a new folder for testing (i.e. desktop\txttest) and paste the copy of the original there.
Step 2: Load the copy using the menu on the left in GSplit (Original File option).
Step 3: Destination folder: I used the same one I put the copy in.
Step 4: Type and size: 20.00 MB.
Step 5: Filenames: you can use your own naming, but you want .txt. Under piece name mask: disk{num}.txt
Step 6: Split file!

Would try this

Definitely try this. More than likely it's a program that does something similar to (and much more sophisticated than) the sample above.

Lyra
05-13-2013, 02:36 PM
It will need to be sanitized obviously to protect the innocent.

This thread made me uncomfortable. What server were you on?

Aeolwind
05-13-2013, 03:20 PM
The problem with trying to load a file such as this is that most programs are going to load the entire file into memory. How much memory do you have? If it's not significantly over 4GB, you're hosed.

Gigabytes? This is the PC I built in 2000. I added an HD at some point because the game was getting too big for the original drive, not to mention all my log files lol. It doesn't even have a USB port. I made a 'bad call' in thinking FireWire would be the most widely accepted protocol.

I'll be getting it over to my 'new' rig (built 2007), but it is capped at 4GB. The 2nd RAM set shorted out; installing sticks in that pair just causes a BSOD crash loop.

This thread made me uncomfortable. What server were you on?

Emarr. Better or worse?

Lyra
05-13-2013, 03:22 PM
Emarr. Better or worse?

Much better :p

Aeolwind
05-13-2013, 03:30 PM
Much better :p

Glad I lied then.

boudicca
05-13-2013, 03:53 PM
http://www.vim.org/download.php

http://gnuwin32.sourceforge.net/packages/less.htm

getsome
05-13-2013, 07:03 PM
Some troubleshooting questions:

Are you trying to open the file on this circa-2000 PC?

If so, what operating system is it running?

What file system is the hard drive where the txt file is stored? (You can right-click the drive and select Properties to see the file system.)

From your earlier post it sounds like your old EQ machine is still functioning. In the interest of maintaining the integrity of your important log file, you may want to create a copy before attempting anything. I recommend connecting the hard drive that contains the file to a USB hard drive adapter and making a copy of your log file onto other media.
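
If you want to be certain the copy came across intact, comparing hashes of the original and the copy is a quick sanity check. A throwaway sketch (the paths are just placeholders):

// Hash both files; identical hashes mean the copy is byte-for-byte identical.
using System;
using System.IO;
using System.Security.Cryptography;

class CompareCopies
{
    static string HashFile(string path)
    {
        using (var sha = SHA256.Create())
        using (var stream = File.OpenRead(path))
            return BitConverter.ToString(sha.ComputeHash(stream));
    }

    static void Main()
    {
        string original = HashFile(@"C:\myeqlog.txt");        // placeholder path
        string copy = HashFile(@"E:\backup\myeqlog.txt");     // placeholder path
        Console.WriteLine(original == copy ? "Copies match." : "Copies DIFFER - recopy the file.");
    }
}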

As for editors, I tested out a few.

I created a 4 GB test file using the following batch file.

echo This is just a sample line appended to create a big file. > dummy.txt
rem Double the file 26 times (roughly 60 bytes doubled 26 times is about 4 GB).
for /L %%i in (1,1,26) do (
  copy /b dummy.txt+dummy.txt dummy2.txt > nul
  move /y dummy2.txt dummy.txt > nul
)

I currently use EmEditor, and it opened the file.

I also tried a free program that worked. http://www.hhdsoftware.com/free-hex-editor

http://i43.tinypic.com/n4ba6v.png

letsallkillandy
05-13-2013, 08:05 PM
There's also a high probability, with a file that large (and old), that it has become corrupted. I'm frankly impressed you've accumulated 4GB of TEXT. The largest log file I've seen was 1.5GB, and it barely opened.

sabinrf24
05-13-2013, 08:28 PM
I recommend logparser 2.2 from Microsoft, of all sources. Pick whatever number increments make sense for you. I use this on 10-20GB logfiles regularly and I find it to be one of the best ways to handle them on Windows.

C:\Program Files (x86)\Log Parser 2.2>logparser -i:TEXTLINE "SELECT * FROM 'F:\Program Files (x86)\Sony\EverQuest\Logs\eqlog_Character_project1999.txt' WHERE Index > 0 AND Index < 10000"

C:\Program Files (x86)\Log Parser 2.2>logparser -i:TEXTLINE "SELECT * FROM 'F:\Program Files (x86)\Sony\EverQuest\Logs\eqlog_Character_project1999.txt' WHERE Index > 10000 AND Index < 20000"

logparser -h to see the export options; for CSV it would be:

C:\Program Files (x86)\Log Parser 2.2>logparser -i:TEXTLINE -o:CSV "SELECT * FROM 'F:\Program Files (x86)\Sony\EverQuest\Logs\eqlog_Character_project1999.txt' WHERE Index > 10000 AND Index < 20000" > C:\folder\test.csv

SCB
05-13-2013, 08:49 PM
I am so excited for you to post these logs to the forums as soon as you can get into the file.

quido
05-13-2013, 08:53 PM
yeah honestly if you're trying to do this on an older computer, I'd just jimmy the HD out of it and stick it in a newer machine

Thulack
05-13-2013, 09:06 PM
A good investment for 20 bucks is an IDE-to-USB device. I have one and love the ability to just take an old IDE HD out of a rig and in 5 seconds have an external HD out of it, and can get whatever info off it at a reasonable pace.

Pringles
05-13-2013, 09:17 PM
Unix / Linux / OSX: more / less / vi / cat will work just fine on a 4GB file; I do it daily on logs much larger than this.

sawin
05-13-2013, 09:32 PM
I also have been looking for a program to open very large files. I have considered using http://www.ultraedit.com/

There is a trial and it boasts the capacity to edit files over 4GB. I wish I could recommend it, but as I haven't yet tried it I'm unsure how well it would work for you.

vulzol
05-13-2013, 09:38 PM
There's also a high probability, with a file that large (and old), that it has become corrupted. I'm frankly impressed you've accumulated 4GB of TEXT. The largest log file I've seen was 1.5GB, and it barely opened.

Hate to be a downer but this sounds probable.

Unfadable
05-13-2013, 11:33 PM
I also have been looking for a program to open very large files. I have considered using http://www.ultraedit.com/

There is a trial and it boasts the capacity to edit files over 4GB. I wish I could recommend it, but as I haven't yet tried it I'm unsure how well it would work for you.


I use this at work and it works well. I think the largest file I've opened with it was a 2.5GB CSV file.

Adachi
05-13-2013, 11:36 PM
Under Unix you can use the split command.

split -l 1000000 file.txt

This would split the file every 1,000,000 lines into files named 'xaa', 'xab', 'xac', etc.

And if you need to rejoin the files after editing them, simply use 'cat'

cat xaa xab xac > newfile.txt

But if you just want to view the file, I'm pretty sure 'less' would work.

Oh, and vim > *
:D

Hackscendence
05-14-2013, 12:19 AM
Line breaks are still line breaks when the file is converted to CSV.

Copy the file, paste it somewhere, change the extension to .csv, use this (http://www.fxfisherman.com/downloads/csv-splitter-1.1.zip), set the max number of lines per file to whatever you'd prefer, then change the resulting file extensions back to .txt.

webrunner5
05-14-2013, 10:57 AM
I think if he breaks the file up this thing is never going to work. Like has been said, one stupid corrupt sector and yarcarbluee. :eek: This needs to be transferred complete.

SirAlvarex
05-14-2013, 07:03 PM
I concur with everyone suggesting "less." It works by buffering segments of the file and does not edit it. I use it to view multi-GB files all the time.

Although split does seem like a nice command.

In less you can search through the file with "/" and a regex, although on a 4GB file that might not be the most ideal lol. Just use grep for that.

Aeolwind
05-14-2013, 10:13 PM
[Wed Jul 31 21:56:37 2002] Gazgaz tells the guild, 'Me slowed uz The Idol of Rallos Zek'
[Wed Jul 31 21:56:37 2002] Raelven tells the guild, 'weakened'
[Wed Jul 31 21:56:37 2002] Karitta tells the guild, '#8 Comple healing ------> Sarsippius #8'
[Wed Jul 31 21:56:37 2002] Hoozyur tells the guild, 'get The Idol of Rallos Zek a wheelchair, his ass is CRIPPLED'
[Wed Jul 31 21:56:37 2002] Tuan tells the guild, '[1] --CH-- Sarsippius'
[Wed Jul 31 21:56:37 2002] Teer tells the guild, '#2 CH <<< Sarsippius >>> #2~~~~~~~~ #2 ~~~~~~~~~ #2'
[Wed Jul 31 21:56:37 2002] Tomatos tells the guild, 'Elixier on Sharlene for Assist Heal, Wating for Big Heal !!'
[Wed Jul 31 21:56:40 2002] Welcome to EverQuest!
[Wed Jul 31 21:56:40 2002] You have entered Nektulos Forest.
[Wed Jul 31 21:56:41 2002] GUILD MOTD: Dabiggun - CT, Creator and HP PWNED!!!! AoW NOW!
[Wed Jul 31 21:56:41 2002] Autojoining channels...

Just a taste

Thulack
05-14-2013, 10:27 PM
So who gets the reward for the right program? :) Grats on getting it to work.

liveitup1216
05-14-2013, 10:28 PM
[Wed Jul 31 21:56:37 2002] Gazgaz tells the guild, 'Me slowed uz The Idol of Rallos Zek'
[Wed Jul 31 21:56:37 2002] Raelven tells the guild, 'weakened'
[Wed Jul 31 21:56:37 2002] Karitta tells the guild, '#8 Comple healing ------> Sarsippius #8'
[Wed Jul 31 21:56:37 2002] Hoozyur tells the guild, 'get The Idol of Rallos Zek a wheelchair, his ass is CRIPPLED'
[Wed Jul 31 21:56:37 2002] Tuan tells the guild, '[1] --CH-- Sarsippius'
[Wed Jul 31 21:56:37 2002] Teer tells the guild, '#2 CH <<< Sarsippius >>> #2~~~~~~~~ #2 ~~~~~~~~~ #2'
[Wed Jul 31 21:56:37 2002] Tomatos tells the guild, 'Elixier on Sharlene for Assist Heal, Wating for Big Heal !!'
[Wed Jul 31 21:56:40 2002] Welcome to EverQuest!
[Wed Jul 31 21:56:40 2002] You have entered Nektulos Forest.
[Wed Jul 31 21:56:41 2002] GUILD MOTD: Dabiggun - CT, Creator and HP PWNED!!!! AoW NOW!
[Wed Jul 31 21:56:41 2002] Autojoining channels...

Just a taste

Is it harder to go through those logs with or without a massive erection?

getsome
05-15-2013, 01:06 AM
I hope you had your filters off.

azxten
05-15-2013, 02:29 AM
Put your file on a linux system.

split -l 10000 filename

This will split your file every 10,000 lines into files xaa, xab, etc. I can give you a more sophisticated command, if you'd like, that names the files according to the timestamps at the start or whatever.

Use the operating system where everything is a file if you're working with files.

Gwence
05-15-2013, 03:07 PM
looks like you wiped

binding in nek forest? odd choice

Aeolwind
05-15-2013, 04:56 PM
looks like you wiped

binding in nek forest? odd choice

Just logging in. I was PLing a monk on my 2nd box in SolB; I fell asleep the night before at the keyboard. If I can get the '1st half' open, I'll post the night.