write SAC data to file

Thanks for the help. Unfortunately I have another problem, I think with the for loop; it would be great if I could get some help.

When I run the code below (with the file names changed), it writes the same SAC file several times, with 01, 02, 03, ... appended to the file name.

code I wrote -------------------------------------------------

from obspy.core import read
from obspy.core import UTCDateTime
from obspy.sac import SacIO

Hi Ahu,

since you are looping over the traces of your stream object (st), you should replace st.write with tr.write. You can do the st.resample and st.trim outside of the loop, as these calls act on all traces contained in the stream object. So, something like this should work:

st = read("/media/disk-1/OBSPY/cut_SAC/van_dene/2011*", format="SAC")
st.resample(50.0)
EventOriginTime = UTCDateTime("2011-10-23T10:41:19")
st.trim(EventOriginTime, EventOriginTime + 360)
for tr in st:
    fname = "%s.sac" % tr.id
    print(fname)
    # station name and component letter sliced out of the file name
    stname = fname[1:5]
    comp = fname[11:12]
    if comp == "Z":
        comp1 = 'BHZ'
    elif comp == "N":
        comp1 = 'BHN'
    elif comp == "E":
        comp1 = 'BHE'
    newfname = stname + "." + comp1 + ".SAC"
    tr.write(newfname, format="SAC")

Cheers,
Yannik

In addition to Yannik's comments:
taking slices out of the string is unsafe, as the position of the
character you are looking for might shift. My earlier example built the
file name simply from the complete trace id (SEED style); if you don't
like that, just use the respective fields in the trace.stats dictionary:

st = read("/media/disk-1/OBSPY/cut_SAC/van_dene/2011*", format="SAC")
st.resample(50.0)
EventOriginTime = UTCDateTime("2011-10-23T10:41:19")
st.trim(EventOriginTime, EventOriginTime+360)
for tr in st:
    newfname = "%s.%s.SAC" % (tr.stats.station, tr.stats.channel)
    tr.write(newfname, format="SAC")

Also, if you have large amounts of data, you should loop over files
instead of reading everything at once (meaning everything is in memory
at the same time!).

import glob
for file in glob.glob("/media/disk-1/OBSPY/cut_SAC/van_dene/2011*"):
    st = read(file)
    ...
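
For completeness, here is a minimal sketch of how such a per-file loop could be combined with the resampling, trimming and writing steps from earlier in this thread (the glob pattern, resampling rate and output naming are just the values used in the examples above, so adjust them to your data):

import glob
from obspy.core import read, UTCDateTime

EventOriginTime = UTCDateTime("2011-10-23T10:41:19")
for fname in glob.glob("/media/disk-1/OBSPY/cut_SAC/van_dene/2011*"):
    # read, process and write one file at a time, so only one file
    # is held in memory at any point
    st = read(fname, format="SAC")
    st.resample(50.0)
    st.trim(EventOriginTime, EventOriginTime + 360)
    for tr in st:
        newfname = "%s.%s.SAC" % (tr.stats.station, tr.stats.channel)
        tr.write(newfname, format="SAC")

This keeps the peak memory usage at roughly the size of a single input file rather than the whole data set.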

best,
Tobias