1/26 I just transferred several thousand files from a Mac to a PC. Mac
files don't carry extensions (the type lives in type/creator codes
instead), so is there any UNIX utility (or PC utility) that will
differentiate between Word and Excel files so that I can
programmatically add the .xls and .doc extensions? "file" simply says
"Microsoft Word document".
\_ write an AppleScript. Unfortunately, I don't know the syntax for
grabbing creator codes from the Finder. Maybe someone else on
here does. AppleScript documentation is horrible.
\_ What I ended up doing was using perl to traverse the directory
hierarchy. I changed all :'s to -'s (PCs don't like :'s) and added
.xls extensions to each file. This is somewhat lame, but 95% of
the files are Excel files and I was able to do this in less than
10 minutes (I already had the perl readdir traversal code handy,
I just had to find it)... of course, I did all this before asking
on the motd because I didn't think I'd find any useful information
here. I'd still be interested in a nice solution (ideally, someone
would tell me that I just need to upgrade my version of file).
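For the archive, that rename pass can be sketched in shell rather than perl. The macfiles directory and sample names below are made up for illustration, and it assumes none of the files already has an extension:

```shell
# Sketch of the OP's rename pass: strip colons, blindly tag .xls.
# "macfiles" and the two sample files are hypothetical stand-ins.
mkdir -p macfiles
touch "macfiles/Q1:Budget" "macfiles/Sales Report"   # stand-in data
find macfiles -type f > filelist        # intermediate list, so the
while IFS= read -r f; do                # renames can't confuse find
  dir=$(dirname "$f")
  base=$(basename "$f" | tr ':' '-')    # PCs don't like :'s
  mv "$f" "$dir/$base.xls"              # blindly tag everything .xls
done < filelist
```

Materializing the file list first matters: renaming while find is still walking the tree can feed the already-renamed names back into the loop.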
\_ you should rename the files while they are on the Mac.
http://www.versiontracker.com/dyn/moreinfo/macosx/14404
\_ link not found and this is System 7 we're talking about.
\_ DEAR GOD WHY???
Uh. can you transfer them to a more modern mac first?
\_ It only took him 10 minutes this way.
\_ 'file' depends on the 'magic number' at the start of each file,
matched against patterns stored in a magic file. You only really
needed an updated magic file.
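One catch with the magic-file approach: .doc and .xls both begin with the same OLE2 magic bytes, so what actually distinguishes them is the stream directory inside the file ("WordDocument" vs "Workbook", stored as UTF-16). A crude byte-scan sketch; classify() is a hypothetical helper, and the sample files are fabricated stand-ins rather than real OLE2 documents:

```shell
# classify() is a hypothetical helper, not part of file(1). It strips
# the NUL bytes of UTF-16 so the OLE2 stream names match as plain ASCII.
classify() {
  stripped=$(tr -d '\000' < "$1")
  case $stripped in
    *Workbook*)     echo xls ;;
    *WordDocument*) echo doc ;;
    *)              echo unknown ;;
  esac
}

# stand-in files containing only the UTF-16 stream names, not real OLE2
printf 'x\000W\000o\000r\000k\000b\000o\000o\000k\000x' > sample.a
printf 'x\000W\000o\000r\000d\000D\000o\000c\000u\000m\000e\000n\000t\000' > sample.b
classify sample.a    # xls
classify sample.b    # doc
```

A Word file with an embedded Excel object would confuse this, so treat it as a heuristic, not a parser.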
\_ This came up on google under "automatically macintosh file
extensions". I bet you can find more:
http://peccatte.karefil.com/software/MacNames/MacNamesEN.htm
\_ using perl, you can issue the file command recursively through
your dirtree and dump the results into a text file. You can
then open up the text file and issue a rename for each of
the files it finds. Should take about an hour to write
up in perl. -williamc
\_ An hour? It should take about 5 minutes in any shell. On a
bad day. Between commercials and surfing. This is a good
time for sed/awk for those too lazy to use the search/replace
function in their text editor.
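For the record, the sed pass being alluded to is roughly the following. It only pays off if your 'file' actually tells the two formats apart (the OP's did not), so the listing here is canned sample output, not real file(1) output, and the filenames are invented:

```shell
# canned file(1)-style listing; with a real tree you would generate it:
#   find . -type f -exec file {} \; > listing
cat > listing <<'EOF'
Q1 Budget: Microsoft Excel worksheet
status memo: Microsoft Word document
notes.txt: ASCII text
EOF

# rewrite "name: Microsoft Excel ..." lines into mv commands
sed -n -e 's/^\(.*\): *Microsoft Excel.*/mv "\1" "\1.xls"/p' \
       -e 's/^\(.*\): *Microsoft Word.*/mv "\1" "\1.doc"/p' listing > cmds
cat cmds
# eyeball cmds, then run them:  sh cmds
```

Dumping generated mv commands to a file first lets you review them before anything is renamed, which is the sane default with several thousand files.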
\_ Hmmm, apparently you don't do any "shell scripting" on a PC
in Windows/DOS.
This can be done in five minutes on a UNIX machine, but
with Windows through a DOS VM box (assuming you're not
using cygwin) I'd have to spend some time testing it to
make sure it was functioning correctly. If using
the find command you'd have to figure out how to
dump the information into an intermediate text file
and parse the contents. Regardless, I'd like to see
your five-minute solution posted on the motd, it will
be informative. Also, wtf does your text editor
have to do with identifying binary files??? Reading
comments like these makes you think that the poster
didn't understand the problem domain. -williamc
\_ i'm not the guy above, but you might want to consider
posting anonymously, lest people associate "williamc"
with "talking out of his ass". cygwin often works
*just fine*, and i don't see any reason to believe it
wouldn't in this case. as the OP has already mentioned
though, 'file' doesn't give enough information to
distinguish between word and excel files. but even
supposing it did, you would either want to use sed/awk
to massage the output into an easier to parse format,
or use a text editor to do it if you don't know how
to use sed/awk. perhaps you should keep quiet instead
of making an ass of yourself next time.
\- ObEmacsDiredMode --psb