
Data Science Projects
Looking for freelance Data Science jobs and project work? PeoplePerHour has you covered.
Data entry: 1,100 invoices (3 line items/invoice)
Hi, I need someone to enter 1,100 invoices from wavesapp.com into another cloud-based ERP system. The cloud-based software is accessible through a regular web browser. Each invoice has:
- client name
- location
- PO number
- items/lines (3 to 4 on average)
- discount (if applicable)
A video call will be arranged for training purposes. Once the candidate is used to the process, each invoice should take 40 to 60 seconds on average.
7 days ago · 91 proposals · Remote

Octoparse Web Scraping Expert for Real Estate Data
I am looking for a highly experienced freelancer who can build a robust and repeatable web scraping framework using Octoparse (or a similar platform) to help us collect property data from three major UK listing platforms:
- Rightmove.co.uk
- Zoopla.co.uk
- OpenRent.co.uk

The scraping setup should follow specific parameters and export new listings daily to a Google Sheet, appending new data without replacing existing entries. For each property that meets the criteria, the following data should be scraped:
- Website link to the listing
- Name of the lettings agent
- Property address as shown on the website
- Marketing price
- Agent's email address (found by searching the agent's website)

Filtering criteria (to be built into the scrape):
- Number of bedrooms: multiple searches based on presets (e.g. 2-bed, 3-bed, 4-bed, 5+ bed), each tied to its own maximum budget cap.
- Budget ranges: configurable per bedroom range (to be input by us via preset criteria).
- Proximity to train/Overground stations: properties must be within 1 mile of a train or Overground station (geolocation required, not just a keyword mention).
- Distance from Central London: must fall within a maximum 20-mile radius of Central London (applied strictly using each platform's area filters or via Octoparse logic).

Output details:
- Data should be exported to a Google Sheet, which appends new listings daily without overwriting or duplicating existing entries.
- Manual export is fine for now, but automated export (if possible within the Octoparse free version) is preferred.

Deliverables:
- Fully configured scraping tasks in Octoparse for all 3 platforms
- Setup of multiple preset searches (based on bedroom/budget combinations)
- Tutorial (video or written guide) showing how we can adjust budgets, bedroom filters, and radius in the future
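The geolocation rules (within 1 mile of a station, within 20 miles of Central London) and the append-without-duplicates requirement are the parts hardest to express in Octoparse alone, so a post-processing step is one option. Below is a minimal sketch, assuming the scraped rows carry latitude/longitude and a listing URL (the column names, station list, and the CSV standing in for the Google Sheet are all hypothetical; a library such as gspread could target the Sheet directly instead).

```python
# Hypothetical post-processing for rows exported from Octoparse: keep listings
# within 1 mile of a station and 20 miles of Central London, then append only
# new listing URLs to an existing file (stand-in for the Google Sheet).
import csv, math, os

CENTRAL_LONDON = (51.5074, -0.1278)   # approximate Charing Cross coordinates
MAX_RADIUS_MILES = 20
MAX_STATION_MILES = 1

def miles_between(a, b):
    """Haversine great-circle distance in miles between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3958.8 * 2 * math.asin(math.sqrt(h))

def keep(listing, stations):
    """listing: dict with 'lat'/'lon'; stations: list of (lat, lon) tuples."""
    here = (float(listing["lat"]), float(listing["lon"]))
    if miles_between(here, CENTRAL_LONDON) > MAX_RADIUS_MILES:
        return False
    return any(miles_between(here, s) <= MAX_STATION_MILES for s in stations)

def append_new(listings, path="listings.csv"):
    """Append rows whose listing URL is not already present (no duplicates)."""
    fieldnames = ["url", "agent", "address", "price", "agent_email"]
    seen = set()
    file_exists = os.path.exists(path)
    if file_exists:
        with open(path, newline="") as f:
            seen = {row["url"] for row in csv.DictReader(f)}
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        if not file_exists:
            writer.writeheader()
        for listing in listings:
            if listing["url"] not in seen:
                writer.writerow({k: listing.get(k, "") for k in fieldnames})
                seen.add(listing["url"])
```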
12 days ago · 31 proposals · Remote

Data migration from Slack to Google Chat
Hi, I'm looking for someone to help with the migration of data from Slack to Google Chat this month, for 25 active users.
21 days ago · 39 proposals · Remote

Data Scientist/ML Analyst for AI Automation
I'm looking for a freelance Data Analyst or ML Specialist to help visualize and interpret client performance data through predictive dashboards.
- Analyze business or legal data for patterns and predictive insights
- Assist in building auto-reporting workflows and client analytics
- Background in data analysis, statistics, or machine learning
- Experience with dashboards or BI tools
- Understanding of CRM/legal/operations data
- Bonus: knowledge of Python or SQL (not required)
22 days ago · 31 proposals · Remote

opportunity
I need statistical analysis
Based on the collected data and using SPSS Version 29, run descriptive and inferential statistical tests.
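The deliverable is expected in SPSS 29, but the analysis pattern itself (descriptive statistics followed by an inferential test) is the same in any tool. Purely as an illustration, a minimal Python/pandas/SciPy sketch is shown below; the file name, group labels, and the choice of an independent-samples t-test are placeholders, since the real test depends on the study design.

```python
# Illustrative only: descriptive statistics plus one example inferential test
# (Welch's independent-samples t-test). File and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("collected_data.csv")
print(df.describe())                           # descriptive statistics per column

group_a = df.loc[df["group"] == "A", "score"]
group_b = df.loc[df["group"] == "B", "score"]
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```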
19 days ago · 39 proposals · Remote

SEO Data Extraction & Trend Analysis (YOY/MOM)
We are looking for someone to start right away on this project and turn it around in 24 hours. Recruitment and briefing to happen Monday 13th April. There is an opportunity for ongoing SEO/data analysis work if this project is successful.

Scope of work:

1. Data extraction:
• Pull YOY and MOM performance data from the following sources:
• Google Analytics 4 (GA4) – sessions, conversions (if available), landing page performance
• Google Search Console (GSC) – impressions, clicks, CTR, and average position by keyword and landing page
• Google Keyword Planner / Keyword IO tool – keyword search volume
• Internal Google Sheets or Excel reports – provided upon start

2. Data merging & processing:
• Combine the following into a unified data set:
• GSC keyword data (clicks, impressions, average position)
• Keyword Planner/IO data (monthly search volume)
• Align keyword ranking and search volume by keyword
• Merge page-level data from GA4 and GSC where relevant

3. Analysis:
• Create clear visuals and pivot tables to analyze:
• YOY traffic and ranking trends by keyword and landing page
• MOM changes and performance fluctuations
• Identify top-performing vs. underperforming pages and keywords
• Flag drops in performance (volume, position, or CTR) and suggest possible causes

4. Reporting & deliverables:
• A consolidated Google Sheet or Excel workbook with:
• Raw data tabs
• Cleaned and merged data sets
• Pivot tables and trend graphs
• Short summary of key insights and recommendations (1–2 pages or in a separate tab)

Tools & skills required:
• Advanced Google Sheets / Excel (incl. VLOOKUP, INDEX/MATCH, pivot tables, graphs)
• Proficiency with GA4, Search Console, and Keyword Planner/IO
• Ideally a strong understanding of SEO metrics and how to analyze them, though simply assembling the data for us to analyse is also acceptable
• Comfortable handling large datasets and aligning multiple sources
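The merge-and-compare work in sections 2 and 3 amounts to joining the GSC export with the keyword-volume export on keyword, then comparing each (keyword, month) row against the same month last year. A rough pandas sketch, assuming CSV exports with the column names shown (all hypothetical), might look like this:

```python
import pandas as pd

# Hypothetical CSV exports; column names are assumptions, not the tools' exact headers.
gsc = pd.read_csv("gsc_keywords.csv")        # keyword, clicks, impressions, ctr, position, month
volume = pd.read_csv("keyword_volume.csv")   # keyword, monthly_search_volume

merged = gsc.merge(volume, on="keyword", how="left")
merged["month"] = pd.to_datetime(merged["month"])

# Year-on-year comparison: shift last year's rows forward so they line up with this year's month.
prior = merged.copy()
prior["month"] = prior["month"] + pd.DateOffset(years=1)
yoy = merged.merge(
    prior[["keyword", "month", "clicks", "position"]],
    on=["keyword", "month"],
    how="left",
    suffixes=("", "_last_year"),
)
yoy["clicks_yoy_change"] = yoy["clicks"] - yoy["clicks_last_year"]
yoy["position_yoy_change"] = yoy["position_last_year"] - yoy["position"]  # positive = improved rank

# Flag notable drops for the summary tab.
flagged = yoy[yoy["clicks_yoy_change"] < 0].sort_values("clicks_yoy_change")
flagged.to_csv("yoy_keyword_trends.csv", index=False)
```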
16 days ago · 24 proposals · Remote

Products catalog listing
We need a skilled freelancer to develop a comprehensive products catalog for our e-commerce store. The work involves aggregating product photos from our inventory and carefully listing all key details for each item including titles, descriptions, pricing and specifications. Metadata must be optimized for search engine visibility and conversions. Specifically, the freelancer's duties are to: catalog around 100 unique products with high resolution photos; extract critical data like names, prices, sizes and enter it neatly into a spreadsheet; write separate compelling descriptions and specifications for each product focused on customer benefits; develop titles and meta descriptions using relevant keywords and tags for search; structure the catalog and data for easy navigation and sorting; proofread all information for accuracy and typos. The finalized catalog needs to showcase our products attractively while effectively communicating their value to potential buyers browsing online. Priority will be given to candidates with proven experience in catalog development, data entry and basic SEO strategies. Portfolio examples would strengthen applications.
3 days ago · 41 proposals · Remote

urgent
Mobile App using WP headless for data
Project Summary: Mobile Application (iOS/Android) with Headless WordPress Backend

Goal: To develop a mobile application (iOS and Android assumed) utilizing a headless WordPress backend via its REST API.

Core Modules: The application encompasses several key functional areas, including user registration & profiles, various types of user-generated content creation and management, content discovery through swipe and list interfaces, private user-to-user messaging, integrated advertising (direct-sell and AdMob), push notifications, administrator/moderation tools, and user account settings. Detailed functionality for each module is specified in the full document linked below.

Key Technical Aspects: Development will involve working with a headless WordPress backend, JSON Web Token authentication, numerous specific custom post types/taxonomies/meta fields, standard and custom REST API endpoints, and integrations with third-party services/plugins including PublishPress Future, Firebase Cloud Messaging (via SDK and Cloud Functions), and the Google AdMob SDK. A specific taxonomy caching system is already implemented on the backend.

Request: Please review the full specification document and the Figma designs provided via the links below. Based on this information, provide a fixed-price quote for the development of the required mobile application (iOS and Android).

Budget: Please note this project is operating under significant budget constraints. Competitive quotes focused on efficient delivery and value are strongly encouraged.

Links: Full Specification Document (includes links to Figma and reference documents): https://docs.google.com/document/d/1J_wcyCejzbxE4AeZIF38S9SkXEOKLeg4zL2Lhm0ZLB0/edit?usp=sharing
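For bidders unfamiliar with headless WordPress, the backend interaction is plain HTTPS against the WP REST API with a JWT in the Authorization header. A minimal sketch follows; the site URL is hypothetical, the token route assumes a typical JWT plugin, and the real custom post types, fields and endpoints are the ones defined in the specification document, not the core `posts` route shown here.

```python
# Minimal sketch of token-based calls against a headless WordPress backend.
import requests

SITE = "https://example.com"                       # hypothetical site URL
TOKEN_URL = f"{SITE}/wp-json/jwt-auth/v1/token"    # assumption: route exposed by a common JWT plugin
POSTS_URL = f"{SITE}/wp-json/wp/v2/posts"          # standard WP REST API endpoint

def get_token(username, password):
    """Exchange credentials for a JWT issued by the WordPress JWT plugin."""
    resp = requests.post(TOKEN_URL, json={"username": username, "password": password})
    resp.raise_for_status()
    return resp.json()["token"]

def list_posts(token, per_page=10):
    """Fetch posts with the JWT passed as a Bearer token."""
    headers = {"Authorization": f"Bearer {token}"}
    resp = requests.get(POSTS_URL, headers=headers, params={"per_page": per_page})
    resp.raise_for_status()
    return [(p["id"], p["title"]["rendered"]) for p in resp.json()]

if __name__ == "__main__":
    token = get_token("app_user", "app_password")
    for post_id, title in list_posts(token):
        print(post_id, title)
```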
25 days ago · 73 proposals · Remote

Expires in 5
Power BI Developer
Daily Code Solutions: We are actively seeking a skilled Power BI Developer to join our dynamic team. This is a fantastic opportunity for someone with a passion for data analytics, strong communication skills, and a commitment to professional growth.

Responsibilities:
- Develop and maintain Power BI reports and dashboards.
- Utilize DAX, Power Query, and data modeling for effective data analysis.
- Participate in a short, open-book technical interview with two complex problems to showcase your problem-solving abilities, and be prepared to present a Power BI report during that interview.
- Collaborate with US clients, requiring availability between 9 and 12 at night.

Preferred qualifications:
- 2+ years of experience as a Power BI Developer.
- Proficiency in DAX, Power Query, and data modeling.
- Experience working with Amazon or e-commerce data is a plus.
- Short notice period availability.
- Samples of previous Power BI work to showcase.
14 days ago · 28 proposals · Remote

opportunity
Smart Developer Needed for AI-Powered Data Engine
Looking for a clever developer to help build out a Google Colab project that pulls in live sports data via API and generates intelligent predictions daily. The basic logic is already working. Now I want to improve it, make it smarter, and expand it to include more data, better accuracy, and automation. You must be confident in Python, APIs, and working with sports data. This is not a UI or full app build – just making the back-end logic as sharp and effective as possible.
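As a rough illustration of the shape of such a Colab cell, the daily pull-and-predict loop could look like the sketch below. The API URL, key, response fields and the naive "better recent form wins" baseline are entirely hypothetical placeholders; the actual provider, schema and model are whatever the existing project already uses.

```python
# Hypothetical daily pull-and-predict cell; endpoint, key and field names are
# placeholders, not a real provider's API.
import requests
import pandas as pd

API_URL = "https://api.example-sports-data.com/v1/fixtures"   # placeholder endpoint
API_KEY = "YOUR_KEY_HERE"

def fetch_fixtures(date_str):
    resp = requests.get(API_URL, params={"date": date_str}, headers={"X-Api-Key": API_KEY})
    resp.raise_for_status()
    return pd.DataFrame(resp.json()["fixtures"])               # placeholder response shape

def predict(fixtures: pd.DataFrame) -> pd.DataFrame:
    # Naive baseline: favour the side with the better recent form score.
    # The real project would replace this with its trained model.
    fixtures["predicted_winner"] = fixtures.apply(
        lambda row: row["home_team"] if row["home_form"] >= row["away_form"] else row["away_team"],
        axis=1,
    )
    return fixtures[["home_team", "away_team", "predicted_winner"]]

fixtures = fetch_fixtures(pd.Timestamp.today().strftime("%Y-%m-%d"))
print(predict(fixtures))
```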
18 days ago · 35 proposals · Remote

Data Scraper to Collect Contact Details from Online Directories
We’re looking for a reliable data scraper to help us gather email addresses and contact details from several online directories we will provide. The collected data will be used to grow our mailing list for business outreach and marketing purposes.

Project scope:
- Scrape contact details (name, email address, phone number if available, and any relevant business info)
- We will provide a list of online directories for data collection
- Goal: 3,000 valid and accurate contact details to start
- Data to be organised and delivered in a clean, structured Excel or Google Sheets format

To apply, please share:
- Examples of similar work you've completed
- Your estimated turnaround time for collecting 3,000 contacts
- Any questions you may have about the project

This could turn into an ongoing opportunity if the initial task is completed successfully.
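A generic sketch of the per-page extraction step is shown below, assuming simple HTML directory pages. The directory URL, the h1 selector for the business name, and the Excel output path are placeholders; each real directory would need its own selectors (and its terms of use checked).

```python
# Generic directory-page scraping sketch; URL and selectors are placeholders.
import re
import requests
from bs4 import BeautifulSoup
import pandas as pd

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scrape_listing(url):
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    text = soup.get_text(" ", strip=True)
    heading = soup.find("h1")                              # placeholder selector per directory
    name = heading.get_text(strip=True) if heading else ""
    emails = sorted(set(EMAIL_RE.findall(text)))
    return {"name": name, "emails": ", ".join(emails), "source": url}

urls = ["https://example-directory.com/listing/1"]         # supplied directory pages
rows = [scrape_listing(u) for u in urls]
pd.DataFrame(rows).to_excel("contacts.xlsx", index=False)  # requires openpyxl
```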
18 days ago · 73 proposals · Remote

Interactive KPI Dashboard in Google Sheets (Monthly & Yearly View)
Brief: I need a professional, interactive KPI dashboard built in Google Sheets, using data from existing tabs. The dashboard must show monthly performance with an option to switch to yearly summaries, and pull data from:
- KPI tab – for actual KPI values
- Sales by month tab – for monthly client sales data
- Benchmark tab – to compare actuals against target benchmarks

Requirements:
- Must be built in Google Sheets only
- Dashboard should automatically pull data from the KPI, Sales by month, and Benchmark tabs
- Visuals to include monthly trends, yearly summaries, and benchmark comparisons
- Easy to navigate, clean layout, and informative
- Use dynamic elements like charts, conditional formatting, and slicers where suitable
- Should clearly show where performance is strong or underperforming
- Needs to be completed tomorrow or by Tuesday at the latest
- Please share examples of similar Google Sheets dashboards you've built

This dashboard will be used for client reporting – it must be visually polished and very clear to read.
9 days ago · 16 proposals · Remote

Ongoing Data Analyst for Marketing (GA4, Looker Studio, SEO)
Key Responsibilities:
• Weekly GA4 reporting & analysis:
• Extract and interpret performance data
• Highlight key trends, insights, and anomalies
• Google Sheets:
• Add visual elements (charts, graphs, trend lines) to enhance data presentation
• Maintain clean, well-organized data sheets for stakeholders
• Looker Studio dashboards:
• Build and update dashboards for client-facing and internal use
• Ensure accuracy and easy-to-interpret visuals
• Automation:
• Use Supermetrics or similar tools to automate data pulls into Sheets and Looker
• Identify opportunities to streamline manual reporting processes
• Tracking & QA:
• Conduct weekly GA4 form tracking checks to ensure conversions are firing correctly
• Flag and troubleshoot discrepancies or drops in data
• Monthly reports:
• Prepare marketing performance slides in Google Slides with insights, visuals, and commentary
• Collaborate with account leads to align reports with client goals
• Search & SEO reporting:
• Pull and interpret data from Google Search Console
• Build regular reports on SEO metrics and search performance
• Conduct search trend analysis to support content and SEO strategy

Ideal candidate:
• 3+ years of experience in data analysis, marketing analytics, or digital performance reporting
• Strong skills in GA4, Looker Studio, Google Sheets, and Google Slides
• Experience with Supermetrics or similar automation tools
• Working knowledge of Google Search Console and SEO fundamentals
• Highly organized and detail-oriented
• Excellent communicator, able to turn data into stories
• Comfortable working independently on recurring tasks

Role details:
• Freelance / part-time
• Fully remote
• Weekly and monthly responsibilities
• Ongoing work, with potential for increased hours or responsibilities
• Fast response and turnaround times are needed for ad hoc analysis
• Great and friendly team to work with
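Where Supermetrics is not available, the weekly GA4 pull can also be scripted directly against the GA4 Data API. A small sketch using the official google-analytics-data Python client follows; the property ID and the particular dimensions and metrics are examples to adapt, and credentials are assumed to come from a service account via GOOGLE_APPLICATION_CREDENTIALS.

```python
# Weekly GA4 pull via the GA4 Data API (google-analytics-data client library).
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import DateRange, Dimension, Metric, RunReportRequest

PROPERTY_ID = "123456789"   # placeholder GA4 property ID

client = BetaAnalyticsDataClient()   # auth via GOOGLE_APPLICATION_CREDENTIALS
request = RunReportRequest(
    property=f"properties/{PROPERTY_ID}",
    dimensions=[Dimension(name="date"), Dimension(name="landingPage")],
    metrics=[Metric(name="sessions"), Metric(name="conversions")],
    date_ranges=[DateRange(start_date="7daysAgo", end_date="yesterday")],
)
response = client.run_report(request)

for row in response.rows:
    dims = [d.value for d in row.dimension_values]
    mets = [m.value for m in row.metric_values]
    print(dims, mets)   # rows can then be pushed into Sheets for the weekly report
```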
16 days ago · 28 proposals · Remote

opportunity
WORDPRESS WEBSITE TO INCLUDE THESE FEATURES
THE BUDGET IS THE BUDGET. IT WON'T BE INCREASED AT ALL. The site will be hosted on IONOS.

I need someone to create a WordPress website. The site needs to have different tabs with all the usual news/features/social media links etc. It also needs to include a database of 150,000 stores, which are in Excel files. This data will need to be uploaded, but not one by one manually. The info includes store name, address and social media links.

We also need a feature where stores can register their own info and sign up for a free in-store video. They input some data and upload a logo, so we need to be able to administer all of this data. If possible we want to be able to link the site to a data storage server/place so we can send them the videos.

We also want ads on the site, which are very important.

We have a style and idea in mind and want to start this as soon as possible.
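One common way to avoid manual entry of the 150,000 stores is to load the spreadsheet with pandas and push rows to the WordPress REST API, or to use an import plugin instead. The sketch below assumes a custom post type exposed at /wp-json/wp/v2/stores, registered meta fields, and an application password for auth, all of which are assumptions about how the site would be set up; for a dataset this size, a CSV/Excel import plugin or WP-CLI run on the server is likely to be far faster than one-by-one API calls.

```python
# Sketch of bulk-loading store records from Excel into WordPress via the REST API.
# Assumes a custom post type at /wp-json/wp/v2/stores with registered meta fields
# and a WordPress application password; an import plugin is an alternative route.
import pandas as pd
import requests

SITE = "https://example.com"                      # hypothetical site URL
AUTH = ("api_user", "application-password-here")  # WordPress application password

stores = pd.read_excel("stores.xlsx")             # assumed columns: name, address, facebook, website

for _, row in stores.iterrows():
    payload = {
        "title": row["name"],
        "status": "publish",
        "meta": {                                  # assumes these meta fields are registered with show_in_rest
            "address": row["address"],
            "facebook": row.get("facebook", ""),
            "website": row.get("website", ""),
        },
    }
    resp = requests.post(f"{SITE}/wp-json/wp/v2/stores", json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
```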
18 days ago · 72 proposals · Remote

Research for a list of smartphone/PC/video game/TCG stores in the UK
THE BUDGET IS FIXED AT £40. YOUR PROPOSAL IS THE BUDGET.

I need someone to research and find all independent smartphone/cell/mobile phone stores in the UK, along with PC stores, video game stores and TCG (trading card game) stores. These are phone repair stores, phone accessories stores or just phone stores. I will give you all the areas to search and will help with which data not to collect.

We only want the smaller independent stores, so you'll need to check. We do NOT want the big chains such as Apple Centres, Sony, T-Mobile, EE, 3, Orange, Vodafone, Argos, HMV, Game, etc.

For each store I need: store name, address, website (if they have one), email address, phone number and Facebook page. Everything needs to be put neatly into an Excel spreadsheet, and the emails also need to be put into a folder.

Method: take each town/city I provide, put it into Google Maps, then search the area close to that town or city. Everything is on there. Then cross-reference with Facebook. CHECK THE DATA, CHECK CHECK CHECK. I don't want bad data, old data or wrong data.

You can search under:
- Mobile phone stores
- Mobile phone repair stores
- Mobile phone accessories stores
- PC/computer stores
- TCG (trading card game) stores
- Video game stores

To apply, I need to see a sample of the work I require: 2 stores in an Excel file, with store name, address, website, email, phone number and Facebook page.

Thank you
7 days ago · 19 proposals · Remote

Excel spreadsheet
Excel spreadsheet required. The business being undertaken is the emptying of chemical site toilets. Certain records have to be kept, some of which will be paper based (carbon-copy books). A waste transfer note is issued for each contract, which runs for a year. A schedule that records each time a toilet is serviced (date, time, quantity of waste) is also required for each waste transfer note. As the waste will be stored in a tanker and only periodically taken to a waste treatment works, I need to be able to produce a record which shows all of the individual services for all of the waste transfer note records between two times and dates.

I need a spreadsheet created in Excel that would allow the following:
- Sequentially numbered pages, with each page for an individual waste transfer note record / schedule of events, with the following data points: a header with the individual waste transfer note detail (client, service location, contract between dates), then a running record of service events (date, time, waste out (litres), water in (litres), location).
- The ability to easily upload a photo image to each numbered waste transfer note / schedule record.
- The ability to produce a report from each waste transfer note record page that has all the data points, all the entries, and totals for the litres out and in.
- The ability to produce a report that draws data points from all of the individual waste transfer note / schedule records and shows all of the service details undertaken between two times and dates.
- It would be useful if there was some way to keep a running record of the reports produced, so that when the next such report is required the start date is auto-populated as the end time/date from the previous report.

Open to proposals. Would prefer UK based.
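The cross-note report in this brief is essentially a date-range filter with totals over one flat table of service events. Purely as an illustration of the data model, the sketch below assumes a single sheet of service events with hypothetical column names; in Excel itself the same logic maps to a filtered table or a pivot built on SUMIFS/COUNTIFS.

```python
# Illustration of the cross-note report, assuming one flat sheet of service events
# with these (hypothetical) columns: wtn_number, client, location, date_time,
# waste_out_litres, water_in_litres.
import pandas as pd

events = pd.read_excel("waste_transfer_records.xlsx", sheet_name="ServiceEvents")
events["date_time"] = pd.to_datetime(events["date_time"])

start = pd.Timestamp("2025-01-01 00:00")   # report window, would come from the previous report's end
end = pd.Timestamp("2025-01-31 23:59")

in_range = events[(events["date_time"] >= start) & (events["date_time"] <= end)]

report = in_range.groupby("wtn_number").agg(
    services=("date_time", "count"),
    total_waste_out_litres=("waste_out_litres", "sum"),
    total_water_in_litres=("water_in_litres", "sum"),
)
print(report)
print("Grand total waste out (litres):", in_range["waste_out_litres"].sum())
```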
17 hours ago · 7 proposals · Remote

Company Logo
I have set up a company in Dubai called Policystream Solutions. This company will be providing leads to the insurance market in the UK, plus data migration, data management and actuarial-type work. I am looking for a logo.
22 days ago · 139 proposals · Remote

Help Fix CSV Import Link and Cron Job Issues
We recently moved our WordPress site, and now the link we use to share CSV import data isn’t working. Also, the scheduled tasks (cron jobs) aren’t running like they used to. We’re looking for someone who can figure out what’s going wrong and get everything working again. Thanks!
12 days ago · 20 proposals · Remote

Backend Dev for WhatsApp Crypto Payments (Node.js/Python)
Description:

Mission-critical tasks:
✔ Implement the NOWPayments API for dynamic USDT addresses (docs provided)
✔ Encrypt sensitive data (AES-256) + patch SQLi/XSS flaws
✔ Optimize Firestore for <100ms response times
✔ Handle webhook failures/retries

Tech Stack:
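On the AES-256 requirement, a minimal Python sketch using the widely used cryptography package is shown below (AES-GCM with a 256-bit key). Key storage, rotation and access control are left out here and are the part that actually needs design work; the NOWPayments integration itself should follow the provided docs rather than this sketch.

```python
# Minimal AES-256-GCM sketch with the `cryptography` package; key storage and
# rotation are out of scope here and matter more than the cipher call itself.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)     # keep in a secrets manager, never in code

def encrypt(plaintext: bytes, associated_data: bytes = b"") -> bytes:
    nonce = os.urandom(12)                    # 96-bit nonce, unique per message
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, associated_data)
    return nonce + ciphertext                 # prepend nonce so decryption can recover it

def decrypt(blob: bytes, associated_data: bytes = b"") -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, associated_data)

token = encrypt(b"wallet address or payout details")
assert decrypt(token) == b"wallet address or payout details"
```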
3 days ago · 17 proposals · Remote

I need the following tasks to be done: Python and Machine Learning.
I need a freelancer who is an expert in writing an original article by extracting data from the following portals.
1. The NCBI GEO database (http://www.ncbi.nlm.nih.gov/geo).
2. The original SRA files should be downloaded and processed using SRA-Toolkit (https://github.com/ncbi/sra-tools/wiki/01.-Downloading-SRA-Toolkit) and Salmon (https://salmon.readthedocs.io/en/latest/salmon.html) to generate FASTQ and gene count files for further analysis.
3. Additionally, the GSE235995 expression profiling dataset, sequenced on the Illumina NovaSeq 6000 (GPL24676) for Homo sapiens, should be acquired from the GEO database.
4. This dataset, which will be divided into 2 groups, should include five calcific aortic valve disease samples and four normal control samples.
5. Data standardization and log₂(data+1) transformation should be performed using Python and the R package ‘DESeq2’ [10] (https://bioconductor.org/packages/release/bioc/html/DESeq2.html), and this dataset will be used for validation.
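For steps 2 and 5, a hedged sketch of how the orchestration might look in Python is given below. The SRA run accessions, Salmon index path and file names are placeholders; the actual accessions come from GSE235995, and the differential-expression/validation step itself would still run in R with DESeq2 as specified.

```python
# Orchestration sketch for steps 2 and 5; accession IDs, index path and file
# names are placeholders. Differential expression itself stays in R/DESeq2.
import subprocess
import numpy as np
import pandas as pd

runs = ["SRRXXXXXXX"]              # placeholder SRA run accessions linked to GSE235995
salmon_index = "salmon_index_hs"   # prebuilt human transcriptome index (placeholder path)

for run in runs:
    subprocess.run(["prefetch", run], check=True)                     # SRA-Toolkit download
    subprocess.run(["fasterq-dump", run, "-O", "fastq"], check=True)  # convert to FASTQ
    subprocess.run([
        "salmon", "quant", "-i", salmon_index, "-l", "A",
        "-1", f"fastq/{run}_1.fastq", "-2", f"fastq/{run}_2.fastq",
        "-p", "8", "-o", f"quant/{run}",
    ], check=True)                                                    # quantify transcripts

# Step 5: log2(count + 1) transformation of a merged gene count matrix
# (genes as rows, samples as columns), prior to the validation analyses.
counts = pd.read_csv("gene_counts.csv", index_col=0)
log_counts = np.log2(counts + 1)
log_counts.to_csv("gene_counts_log2.csv")
```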
15 days ago · 11 proposals · Remote