Multi Threaded Python scraper optimization project
$30-250 USD
Paid on delivery
I have a proxy supported python scraper hosted on Heroku using Django framework that does the following:
1) the user inputs a list of keywords
2) the scraper goes to [login to view URL], searches for each keyword, and returns product information for the first page of search results for each keyword (usually 60 products per keyword). Product information scraped includes title, description, features, review rating, price, seller, etc. It also uses the [login to view URL] API to return Google Keyword Planner search volume data for each search term.
3) when complete it exports the data to a spreadsheet
The scraper is fully functional and no issues there.
The issue I am having is that my scraper uses too much memory and will abruptly stop when it hits the usage limit on my Heroku account (1GB). I am looking for someone with experience troubleshooting and optimizing Python scraping code who could make my scraper run more efficiently. There may be a smoking gun that is the main cause of the memory leak; I'm not sure. See the attached image of the Heroku metrics page showing the memory spike.
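For whoever takes this on: without seeing the source, one quick way to look for that smoking gun is Python's built-in tracemalloc module, which reports which lines allocate the most memory. The sketch below is a generic harness, not the project's actual code; `workload` stands in for one scrape cycle.

```python
import tracemalloc

def top_allocations(workload, limit=5):
    """Run a workload under tracemalloc and return the biggest
    allocation sites, largest first. `workload` is any zero-argument
    callable standing in for one scrape cycle."""
    tracemalloc.start()
    result = workload()  # hold a reference so allocations survive until the snapshot
    snapshot = tracemalloc.take_snapshot()
    tracemalloc.stop()
    del result
    return snapshot.statistics("lineno")[:limit]

# Example: a stand-in workload that allocates a noticeable amount of memory.
if __name__ == "__main__":
    for stat in top_allocations(lambda: [str(i) * 100 for i in range(10_000)]):
        print(stat)
```

Running this around a single keyword's scrape (instead of the stand-in lambda) would show whether memory is going into parsed HTML trees, accumulated result lists, or somewhere unexpected.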
If it cannot be optimized further than it already is, the last resort I can think of is to restrict the number of scraped results for a given keyword (for example, 25 scraped product results per keyword) so the export file doesn't return 60 results per page, thereby reducing the run time and memory used. This would also require altering the code that restricts the number of keywords the user can search on each run.
The deliverables for this project include:
1) optimizing the source code (currently on Heroku) so I can scrape a minimum of 10 keywords and 25 results per keyword per run without the scraper stopping.
2) I also have a couple of really simple tweaks I need made to fix a couple of fields in the export data.
I'm hoping to find someone with strong experience in Python scraping, Django, and proxies who can fix my problem simply and quickly. I have a lot of potential future work for someone highly skilled in Python web scraping. Please message me with any further questions.
Project ID: #19700691
About the project
19 freelance professionals submitted an average bid of $199 for this job
Hi there, I can do it very quickly and effectively. I have more than 18 years of web development experience. Looking forward to working with you! Thanks!
Hello. I'm an expert in "Django, Python, Software Architecture, Web Scraping" and I have been working in this field for 7+ years. I'm very interested in your project. I have checked your project description carefully and I …
Hi, I am an ex-Microsoft employee, an expert web scraper, and an experienced all-round Python/Django developer. I am working with a small team of talented developers and am confident of optimizing your Python scrapin…
(If you choose me, please at least send a "HI" message; if you just invite me to the project I can't respond to you, as those are the Freelancer rules. So it's my request.) Hi there, I have scraped Amazon, AliExpress, Yellow Pages, Yelp, zoma…
Hello, hope you are doing well. I have in-depth technical knowledge of Python, Flask, Node.js, JavaScript, PostgreSQL administration, Django, MySQL, MongoDB, XML-RPC services, stores, ERP/CRM portals, and implement…
Dear Sir, nice to meet you. I read your project description carefully and am very interested in working on your project. I am able to provide the best product with awesome performance and offer a good resu…
Hi, I looked at your description carefully and I am very interested in offering my skills in hopes of working with you. I have ample experience in web scraping: I can scrape any website using Selenium, BeautifulSoup, reques…
Hi there. I am a Python expert, and after reading your requirements carefully, I am sure I can write a Python application for you that can scrape the Amazon sites.
I don't know which scraping technique the previous developer used. You would have to show us the code; then we can bring you ideas to optimize it.