Python Multiprocessing with shared data source and multiple class instances
My program needs to spawn multiple instances of a class, each processing data that comes from a streaming data source.
For example:
    parameters = [1, 2, 3]

    class FakeStreamingApi:
        def __init__(self):
            pass

        def data(self):
            return 42

    class DoStuff:
        def __init__(self, parameter):
            self.parameter = parameter

        def run(self):
            data = streaming_api.data()
            output = self.parameter ** 2 + data  # CPU-intensive task
            print(output)

    streaming_api = FakeStreamingApi()

    # Here's how it works with no multiprocessing
    instance_1 = DoStuff(parameters[0])
    instance_1.run()
Once the instances are running they don't need to interact with each other; they just have to get the data as it comes in (and print output, error messages, etc.).
I am totally at a loss as to how to make this work with multiprocessing, since I first have to create a new instance of the class DoStuff, and then have it run.
This is definitely not the way to do it:
    # Let's try multiprocessing
    import multiprocessing

    for parameter in parameters:
        processes = [ multiprocessing.Process(target = DoStuff, args = (parameter)) ]
        # Hmm, this doesn't work...
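In hindsight, two things look wrong here: args = (parameter) is just a parenthesised value, not a tuple, so it would need a trailing comma; and target = DoStuff only calls the constructor in the child, never run(). A minimal sketch of one possible fix, assuming the classes above (the bound run method is used as the target, which works at least where children are forked):

    # Inside the loop: build the instance in the parent, run it in the child
    p = multiprocessing.Process(target = DoStuff(parameter).run)
    p.start()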
We could try defining a function to spawn the classes, but that seems ugly:
    import multiprocessing

    def spawn_classes(parameter):
        instance = DoStuff(parameter)
        instance.run()

    for parameter in parameters:
        processes = [ multiprocessing.Process(target = spawn_classes, args = (parameter,)) ]
        # Can't tell if this works -- no output on the screen?
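Presumably there is no output because the Process objects are only created, never started. A sketch of the same idea with start() and join() added (the __main__ guard matters on platforms that spawn rather than fork):

    import multiprocessing

    def spawn_classes(parameter):
        instance = DoStuff(parameter)
        instance.run()

    if __name__ == '__main__':
        processes = []
        for parameter in parameters:
            p = multiprocessing.Process(target = spawn_classes, args = (parameter,))
            processes.append(p)
            p.start()  # without start() the child never runs
        for p in processes:
            p.join()   # wait for all the children to finish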
Plus, I don't want to have 3 different copies of the API interface class running; I want that data to be shared between all of the processes... and as far as I can tell, multiprocessing creates a copy of everything for each new process.
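One pattern that might address this (a sketch only, assuming the FakeStreamingApi and parameters above): keep the API object in a single process and fan the data out to the workers over multiprocessing.Queue objects, so the workers never hold their own copy of the API:

    import multiprocessing

    def worker(parameter, queue):
        data = queue.get()            # blocks until the parent sends data
        print(parameter ** 2 + data)  # CPU-intensive task goes here

    if __name__ == '__main__':
        streaming_api = FakeStreamingApi()
        queues = [multiprocessing.Queue() for _ in parameters]
        processes = [multiprocessing.Process(target = worker, args = (p, q))
                     for p, q in zip(parameters, queues)]
        for proc in processes:
            proc.start()
        item = streaming_api.data()   # only the parent touches the API
        for q in queues:
            q.put(item)
        for proc in processes:
            proc.join()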
Any ideas?
Edit: I think I may have got it... Is there anything wrong with this?
    import multiprocessing

    parameters = [1, 2, 3]

    class FakeStreamingApi:
        def __init__(self):
            pass

        def data(self):
            return 42

    class Worker(multiprocessing.Process):
        def __init__(self, parameter):
            super(Worker, self).__init__()
            self.parameter = parameter

        def run(self):
            data = streaming_api.data()
            output = self.parameter ** 2 + data  # CPU-intensive task
            print(output)

    streaming_api = FakeStreamingApi()

    if __name__ == '__main__':
        jobs = []
        for parameter in parameters:
            p = Worker(parameter)
            jobs.append(p)
            p.start()
        for j in jobs:
            j.join()
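One caveat I am not sure about: streaming_api lives at module level, so each Worker presumably still ends up with its own copy of FakeStreamingApi (copied at fork time, or re-created on import under spawn), meaning the API is not truly shared. A sketch of a variant where only the parent touches the API and each Worker reads from a Queue:

    import multiprocessing

    class Worker(multiprocessing.Process):
        def __init__(self, parameter, queue):
            super(Worker, self).__init__()
            self.parameter = parameter
            self.queue = queue                   # filled by the parent

        def run(self):
            data = self.queue.get()
            output = self.parameter ** 2 + data  # CPU-intensive task
            print(output)

    if __name__ == '__main__':
        streaming_api = FakeStreamingApi()
        jobs = []
        for parameter in parameters:
            q = multiprocessing.Queue()
            w = Worker(parameter, q)
            jobs.append((w, q))
            w.start()
        for w, q in jobs:
            q.put(streaming_api.data())          # only the parent calls the API
        for w, q in jobs:
            w.join()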