python - Should I use multi-threading? (retrieving mass data from APIs)


I have a Python script that gathers data for tens of thousands of 'people' from an API, then goes on to request two other APIs to gather further data about them, and saves the information to a local database. It takes around 0.9 seconds per person.

So at the moment it takes a long time to complete. Would multi-threading speed it up? I tried a multi-threading test locally and it was slower, but that test was a simple function without any API interaction or other web/disk-related work.

Thanks

How many cores do you have?

How parallelizable is the process?

Is the problem CPU bound?

If you have several cores and the work is parallelizable across them, you're likely to see a speed boost. The overhead of multithreading isn't 100% unless it's implemented awfully, so that's a plus.
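Since the per-person work described in the question is dominated by API calls (network I/O), threads can overlap the waiting even though Python has a GIL. A minimal sketch using `concurrent.futures`; the `fetch_person` function and its `time.sleep` are stand-ins for the real API requests:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_person(person_id):
    """Stand-in for the real API round-trips: ~0.1 s of network wait each."""
    time.sleep(0.1)
    return {"id": person_id}

ids = range(20)

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(fetch_person, ids))
elapsed = time.perf_counter() - start

# 20 tasks of ~0.1 s each, 10 in flight at a time: roughly 0.2 s
# instead of the ~2 s a sequential loop would take.
print(f"{len(results)} people in {elapsed:.2f} s")
```

Tuning `max_workers` matters: for I/O-bound work it can be much larger than the core count, since the threads spend most of their time blocked on the network.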

On the other hand, if the slow part is CPU bound it might be a lot more fruitful to write a C extension or use Cython. Both of those can at times give a 100× speedup (sometimes more, sometimes less, depending on how numeric the code is) for less effort than the 2× speed-up of naïve usage of multiprocessing, and the 100× speedup applies only to the translated code.
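As a rough illustration of why moving CPU-bound work to C-level code pays off, compare a pure-Python loop with the same arithmetic done in C inside the interpreter (the builtin `sum` over a `range`); the speed ratio is a stand-in for what a C extension or Cython can buy:

```python
import time

N = 2_000_000

# Pure-Python loop: every iteration executes interpreter bytecode.
start = time.perf_counter()
total_py = 0
for i in range(N):
    total_py += i
py_time = time.perf_counter() - start

# The same arithmetic done in C via the builtin sum().
start = time.perf_counter()
total_c = sum(range(N))
c_time = time.perf_counter() - start

print(f"python loop: {py_time:.3f} s, builtin sum: {c_time:.3f} s")
```

The builtin is typically several times faster here; a tight numeric kernel compiled with Cython can widen that gap much further.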

But, seriously, profile. Chances are there is low-hanging fruit that is easier to get at than any of this. Try a line profiler (say, the one called line_profiler [also called kernprof]) and the builtin cProfile.
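The builtin `cProfile` needs no code changes to get started; a minimal sketch, where `slow_part` and `fast_part` are made-up functions standing in for the script's API calls and local work:

```python
import cProfile
import io
import pstats
import time

def slow_part():
    time.sleep(0.05)   # stands in for an API round-trip

def fast_part():
    sum(range(1000))   # cheap local work

def process_person():
    slow_part()
    fast_part()

profiler = cProfile.Profile()
profiler.enable()
for _ in range(5):
    process_person()
profiler.disable()

out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())  # slow_part dominates the cumulative column
```

For a whole script you can also run `python -m cProfile -s cumulative yourscript.py` without editing anything; line_profiler then narrows the hot function down to individual lines.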

