Neries event feed


I'm looking to get a relatively continuous feed of earthquake locations from NERIES using ObsPy. With subsequent calls to client.getEvents I typically get HTTP socket timeout errors after letting this run for a while.

Is there a better way to get a near real-time feed of events from Neries?

Many thanks!

Here is the barebones of what I am trying to accomplish:

import time
from obspy.neries import Client
from obspy import UTCDateTime

interval1 = 1800.0  # width of the query window in seconds
interval = 30.0     # pause between successive requests in seconds

client = Client(user='')
while True:
    t2 = UTCDateTime()
    t1 = t2 - interval1
    print 'Getting events from ' + str(t1) + ' to ' + str(t2) + ' ... '
    events = client.getEvents(min_datetime=t1, max_datetime=t2, format='catalog')
    nev = len(events)
    print 'Number of events: ' + str(nev) + ' ... '
    del events
    time.sleep(interval)

Hi Josh,

You could add another loop in there to try again if an error is encountered:

while True:
    print 'Getting events from ' + str(t1) + ' to ' + str(t2) + ' ... '
    for i in xrange(max_attempts):
        try:
            events = client.getEvents(min_datetime=t1, max_datetime=t2, format='catalog')
        except NameOfExceptionYoureGetting:
            print 'Time out error. Trying again (%d)' % (i)
        else:
            break
    else:
        print 'Could not retrieve data. Skipping this time interval'
        continue
    nev = len(events)
    print 'Number of events: ' + str(nev) + ' ... '

The 'else' clause of a 'try'/'except' block executes when the code in 'try' does not raise an exception.

The 'else' clause of a 'for' loop executes if the loop finishes naturally (i.e., it was not exited via 'break').
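To see both 'else' clauses working together, here is a small self-contained sketch of the retry pattern. The flaky fetcher is made up for the demo (it stands in for client.getEvents and its timeout); only the control flow is the point:

```python
def fetch_with_retries(fetch, max_attempts=5):
    # Retry `fetch` until it succeeds or max_attempts is exhausted.
    for i in range(max_attempts):
        try:
            result = fetch()
        except RuntimeError:
            print('attempt %d failed, trying again' % i)
        else:
            break          # the try body succeeded: stop retrying
    else:
        # for/else: runs only if the loop was never left via 'break'
        print('could not retrieve data')
        return None
    return result

calls = {'n': 0}
def flaky():
    # Made-up stand-in for client.getEvents(): fails twice, then succeeds.
    calls['n'] += 1
    if calls['n'] < 3:
        raise RuntimeError('timed out')
    return 'catalog'

print(fetch_with_retries(flaky))   # prints 'catalog' after two retries
```

If every attempt fails, the for loop runs to completion, the for/else branch fires, and the function returns None instead of a catalog.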


Thanks Leo. That's a good idea. I'm still coming up to speed on Python, but the exception that gets raised seems to vary between OSes (Debian versus OS X). Debian throws this:

Traceback (most recent call last):
  File "./", line 23, in <module>
    events = client.getEvents(min_datetime=t1, max_datetime=t2, format='catalog')
  File "/home/stach/local/src/obspy-dev/obspy/neries/", line 81, in wrapper
    v = f(*args, **new_kwargs)
  File "/home/stach/local/src/obspy-dev/obspy/neries/", line 279, in getEvents
    data = self._fetch("/services/event/search", **kwargs)
  File "/home/stach/local/src/obspy-dev/obspy/neries/", line 160, in _fetch
    response = urllib2.urlopen(remoteaddr, timeout=self.timeout)
  File "/usr/lib/python2.7/", line 127, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.7/", line 401, in open
    response = self._open(req, data)
  File "/usr/lib/python2.7/", line 419, in _open
    '_open', req)
  File "/usr/lib/python2.7/", line 379, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/", line 1211, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.7/", line 1184, in do_open
    r = h.getresponse(buffering=True)
  File "/usr/lib/python2.7/", line 1034, in getresponse
  File "/usr/lib/python2.7/", line 407, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/", line 365, in _read_status
    line = self.fp.readline()
  File "/usr/lib/python2.7/", line 447, in readline
    data = self._sock.recv(self._rbufsize)
socket.timeout: timed out

I was hoping there would be a more data-driven solution than the infinite while loop that polls for events. With this method, the "near real-time" granularity is determined by how often the pull requests are made (i.e., how small 'interval' is).

Thanks for your help!

Josh, that is the easiest solution I can think of. It might not be top-notch engineering, but it serves its purpose, right?
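One way to make polling less sensitive to the choice of window and interval is to query overlapping windows and deduplicate by event ID, so the interval only affects latency, not completeness. A rough sketch (the dict-shaped events and the get_events callable are invented for illustration; a real feed would supply its own event IDs):

```python
def poll_new(get_events, seen):
    # Return only events whose IDs have not been seen before.
    # `seen` is a set that persists across successive polls.
    new = []
    for event in get_events():
        if event['id'] not in seen:
            seen.add(event['id'])
            new.append(event)
    return new

seen = set()
window1 = [{'id': 'ev1'}, {'id': 'ev2'}]
window2 = [{'id': 'ev2'}, {'id': 'ev3'}]   # overlapping window repeats ev2

print([e['id'] for e in poll_new(lambda: window1, seen)])  # ['ev1', 'ev2']
print([e['id'] for e in poll_new(lambda: window2, seen)])  # ['ev3']
```

Because duplicates are dropped, the two query windows can overlap generously, which also protects against events that arrive in the catalog late.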

Regarding the different exceptions, you can add another 'except' clause after the first one for the other exception. Something like this:

import socket

try:
    events = ...
except socket.timeout:
    ...
except OSXErrorName:
    ...

Execution stops at the first 'except' clause that matches the raised exception.
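To make that ordering concrete, here is a small self-contained demo. The fetch helper and the particular exception types are invented for illustration; substitute whatever your platforms actually raise:

```python
import socket

def fetch(fail_with=None):
    # Stand-in for client.getEvents(): raises on demand for the demo.
    if fail_with is not None:
        raise fail_with
    return ['event1', 'event2']

def attempt(exc):
    try:
        events = fetch(fail_with=exc)
    except socket.timeout:
        return 'timeout'       # first matching clause wins
    except IOError:
        return 'io-error'      # only reached if the clause above did not match
    else:
        return 'ok: %d events' % len(events)

print(attempt(None))                          # ok: 2 events
print(attempt(socket.timeout('timed out')))   # timeout
print(attempt(IOError('connection reset')))   # io-error
```

Note that clause order matters when the exception types are related: the more specific type (socket.timeout here) should come before the more general one, or the general clause will swallow everything.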


Thanks for the suggestions. The event feed was getting very far behind so I inquired with Neries and the response points to moving to FDSN web services:

if you want to access data by web service, you can use:

it follows the FDSN web service specification

supported export types are text, quakeml and geojson; example of a query (last 10 events):

However, this doesn't appear to be compatible with the latest obspy.fdsn:

from obspy.fdsn import Client
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/data/SFW/obspy-dev/obspy/fdsn/", line 116, in __init__
  File "/data/SFW/obspy-dev/obspy/fdsn/", line 1018, in _discover_services
    services["event"] = WADLParser(wadl).parameters
  File "/data/SFW/obspy-dev/obspy/fdsn/", line 93, in __init__
  File "/data/SFW/obspy-dev/obspy/fdsn/", line 183, in add_parameter
    "doc_title": doc_title.strip(),
AttributeError: 'NoneType' object has no attribute 'strip'

Is the obspy.fdsn client only compatible with IRIS web services?

Hey Josh,

I was not aware that the seismicportal has an active FDSN implementation.

This is a bug (either in ObsPy, or the WADL file from the seismic portal is not fully valid). In any case, the WADL parser in ObsPy should be flexible enough to handle it. We unfortunately have lots of issues with the WADL files, because most FDSN web service implementations differ slightly from each other, making it hard to write a generic parser for them.

I opened a new issue on Github to track the bug:



Great, thanks. It looks like the implementation is not complete yet since there is no link to /station or /dataselect