
PostgreSQL Projects
Looking for freelance Postgresql jobs and project work? PeoplePerHour has you covered.
opportunity
Create CMS application using Java Springboot with rest api
User Authentication & Security
● JWT-based login with optional 2FA
● Role-based access (Admin)
● Multi-tenancy structure
● CRUD APIs
● Notification service (email/SMS) APIs
● Registration Form API endpoints (4–5 controllers)
● Store data in a PostgreSQL database

Security & Compliance
● Full audit logging
● Encryption (AES-256, TLS 1.2+)
● Backend unit & integration tests

Database (PostgreSQL) Schema Design
● Users, Roles, Cases, Documents schema
● Document metadata storage

Optimisation
● Query performance, indexing, caching

Deployment (Microsoft Azure)
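The role-based, multi-tenant access rule above can be sketched as plain logic. This is a hypothetical illustration only (the posting asks for Java Spring Boot, where this would map to @PreAuthorize-style annotations); the class and field names are assumptions, not part of the spec.

```python
# Hypothetical sketch of the role-based, tenant-scoped access check
# described in the posting. Names (User, can_access) are illustrative.

from dataclasses import dataclass

@dataclass
class User:
    username: str
    role: str        # e.g. "ADMIN" or "STAFF"
    tenant_id: int   # multi-tenancy: every record is scoped to a tenant

def can_access(user: User, resource_tenant: int, required_role: str = "ADMIN") -> bool:
    """A request passes only if the user belongs to the resource's
    tenant AND holds the required role."""
    return user.tenant_id == resource_tenant and user.role == required_role

admin = User("alice", "ADMIN", tenant_id=1)
print(can_access(admin, resource_tenant=1))  # True
print(can_access(admin, resource_tenant=2))  # False: wrong tenant
```

In the real build this check would sit behind the JWT filter, with the tenant and role claims read from the verified token.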
8 hours ago · 8 proposals · Remote
Web Developer Needed, Arduino-Based Escape Rooms
Project Type: One-off project
Expertise Level: Expert
Industry: Entertainment / Live Events
Duration: 4–8 weeks (with potential follow-up work)
Remote OK: Yes

Project Summary
We operate physical escape rooms powered by Arduino-based puzzle boards. Each board sends a simple signal when a puzzle is completed. We need a web-based system that receives these signals, updates a session log or database, and sends the appropriate trigger to move the game forward (e.g. unlock doors, play sound). It should also provide hints on each puzzle if needed. This app will serve as the central controller for our games, used by staff via tablet or desktop.

What We Need
You'll build a web app (dashboard + backend) that:
- Accepts basic status messages from Arduino devices ('puzzle complete')
- Updates a room session database in real time
- Sends commands to a main controller (e.g. via API call or MQTT) to trigger the next event
- Includes a simple dashboard for staff to:
  - Monitor puzzle completion status
  - Start/reset rooms
  - Trigger hints or manual overrides

Ideal Skills
- Full-stack web development (Node.js, Express, React/Vue, or similar)
- Experience integrating Arduino/microcontrollers with web servers (HTTP/WebSocket/MQTT)
- Real-time communication systems
- Firebase, PostgreSQL, or similar database design for operator dashboards (mobile-friendly)

What We're Looking For
- Someone who can propose the right architecture and lead development
- Reliable, responsive, and capable of delivering a fast MVP
- Experience with real-time systems or escape room tech is a big plus

To Apply
Send us:
- A short message about your experience with Arduino or real-time dashboards
- Examples of relevant projects
- Your rate (hourly or fixed)

If you're excited by live event tech and interactive experiences, we'd love to hear from you!
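The controller loop the posting describes (accept a 'puzzle complete' message, update session state, emit the next trigger) can be sketched in a few lines. Device and trigger names below are invented for illustration; the real message format would come from the Arduino firmware.

```python
# Minimal sketch of the central-controller logic: process a status
# message from an Arduino board, mark the puzzle done in the session,
# and return the trigger to forward (e.g. over MQTT). Hypothetical names.

session = {
    "room": "vault",
    "puzzles": {"keypad": False, "wiring": False, "dial": False},
    # which trigger fires once a given puzzle completes
    "triggers": {"keypad": "unlock_drawer", "wiring": "play_sound", "dial": "open_door"},
}

def handle_message(session: dict, msg: dict):
    """Handle {'device': ..., 'status': 'complete'} from a puzzle board.
    Returns the trigger command to send next, or None if the message
    is malformed or references an unknown device."""
    device = msg.get("device")
    if msg.get("status") != "complete" or device not in session["puzzles"]:
        return None
    session["puzzles"][device] = True
    return session["triggers"][device]

print(handle_message(session, {"device": "keypad", "status": "complete"}))  # unlock_drawer
```

In the real system the dashboard would read `session["puzzles"]` for live status, and the returned trigger string would be published to the main controller.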
9 days ago · 38 proposals · Remote
Make dating website
We seek an experienced web developer to design and build a full-stack online dating website. The website is subscription-based, and members can also send gifts. It must support both PayPal and Stripe. It should allow users to freely create profiles, browse profiles of other members matching their preferences, and initiate contact with other users. Key features include a registration process with identity verification, a profile builder to upload photos and detailed bios, an advanced search feature to filter profiles by location, age, and interests, and a secure messaging system. Uploading materials is required; this is not a normal site, and the materials are what make it stand out.

The application stack needs to be robust yet user-friendly. Profile data, including photos, must be stored securely in a database. Users would log in with a unique username and password. The front-end interface should have an intuitive UI/UX design optimized for mobile usage. Performance and uptime are crucial as this will be a public-facing website. Security vulnerabilities must be addressed rigorously.

The developer would be responsible for the full development lifecycle, from designing the database structure and front-end components to ongoing hosting, maintenance, and updates. Proficiency in technologies like Python/Django, React, and PostgreSQL is essential. Experience building similar full-stack web applications is preferred. The goal is to launch an attractive, easy-to-use, and secure dating platform to facilitate connections between users. We welcome proposals from qualified developers confident in their ability to deliver such a project on schedule and within budget.
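The advanced-search requirement (filter by location, age, and interests) reduces to a small predicate over profile fields. A minimal sketch, with assumed field names; in the real app this would be a single PostgreSQL query over indexed columns rather than an in-memory scan:

```python
# Hypothetical profile-search filter matching the posting's criteria:
# location, age range, and interests. Field names are assumptions.

profiles = [
    {"name": "A", "city": "London", "age": 29, "interests": {"hiking", "film"}},
    {"name": "B", "city": "Leeds",  "age": 35, "interests": {"cooking"}},
    {"name": "C", "city": "London", "age": 41, "interests": {"film", "travel"}},
]

def search(profiles, city=None, min_age=None, max_age=None, interest=None):
    """Return names of profiles passing every filter that was supplied."""
    out = []
    for p in profiles:
        if city and p["city"] != city:
            continue
        if min_age is not None and p["age"] < min_age:
            continue
        if max_age is not None and p["age"] > max_age:
            continue
        if interest and interest not in p["interests"]:
            continue
        out.append(p["name"])
    return out

print(search(profiles, city="London", interest="film"))  # ['A', 'C']
```

The SQL equivalent would be a `WHERE city = %s AND age BETWEEN %s AND %s` query, with the interests filter handled by a join or an array-containment condition.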
19 days ago · 30 proposals · Remote
AI-Powered Medical Application
Build a comprehensive AI-powered medical application that serves three primary user groups: Doctors, Patients, and Healthcare Providers (Hospitals & Clinics). The app should include:

User Roles & Features

Doctor's Portal:
- AI-assisted diagnostics and medical insights
- Appointment management system (view, accept, reschedule, cancel)
- EHR (Electronic Health Records) access with AI-powered data analysis
- Prescription & treatment plan generation with AI assistance
- Voice-to-text notes for easy documentation
- Secure video consultations & chat with patients
- Integration with wearable devices for real-time patient monitoring

Patient App:
- Smart appointment booking (AI-based scheduling & reminders)
- AI-powered symptom checker and health recommendations
- Access to personal medical records & history
- Medication reminders & prescription tracking
- Secure telemedicine consultations with doctors
- AI-powered health monitoring via connected devices
- Emergency alert system to notify doctors/hospitals
- AI chatbot for medical queries

Hospital & Clinic Dashboard:
- Centralized patient management system
- AI-powered analytics for patient trends & operational efficiency
- Automated insurance & billing processing
- Integration with medical devices & hospital systems (HL7, FHIR)
- Doctor availability tracking & scheduling optimization
- Emergency case prioritization & triage using AI

Additional AI Features:
- AI-driven health predictions based on medical history
- AI-powered radiology & lab result analysis
- Secure blockchain-based data sharing between stakeholders
- Multilingual AI assistant for global accessibility
- HIPAA & GDPR compliance for data security and privacy

Tech Stack & Integration:
- Frontend: React Native (for cross-platform mobile support)
- Backend: Supabase / Firebase for scalable data storage
- AI Engine: OpenAI / Google Health AI / TensorFlow for AI-driven diagnostics
- Database: PostgreSQL / NoSQL for handling EHR
- Third-party Integration: HL7 / FHIR API for interoperability with hospital systems

Objective:
The app should provide seamless, AI-powered healthcare management, making medical services more accessible, efficient, and data-driven for all users.

Key Deliverables:
- Fully functional AI-powered app for doctors, patients, and hospitals
- Secure and compliant EHR system with AI-driven insights
- AI-enhanced telemedicine & appointment management
- Scalable and user-friendly UI/UX for all stakeholders
a month ago · 39 proposals · Remote
Past "PostgreSQL" Projects
opportunity
IoT data storage in a PostgreSQL database
Need programming to store IoT payloads in a database. The data flow is to connect an application (hosted on Things Industries) to a PostgreSQL database (hosted on Aiven) using a Python script available on GitLab (https://gitlab.com/shane_hull/ttn2postgresql). We have already set up the Postgres DB on Aiven, and devices on Things Industries are already streaming data successfully. Please take a look FIRST at the GitLab script, as these are exactly the steps I need to adapt. The Python script is to be hosted on AWS as a start.
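The core of the ttn2postgresql flow is flattening a Things Stack uplink message into a database row. A hedged sketch: the JSON shape below is a typical v3 uplink, but the exact fields in a given deployment may differ, and the table/column names are placeholders.

```python
# Sketch: parse a Things Stack uplink JSON and flatten it into a row
# ready for PostgreSQL. Payload shape and column names are assumptions.

import json

def uplink_to_row(raw: str) -> tuple:
    """Extract (device_id, received_at, payload_json) from an uplink."""
    msg = json.loads(raw)
    dev = msg["end_device_ids"]["device_id"]
    up = msg["uplink_message"]
    return (dev, up["received_at"], json.dumps(up["decoded_payload"]))

sample = json.dumps({
    "end_device_ids": {"device_id": "sensor-01"},
    "uplink_message": {
        "received_at": "2025-03-13T10:00:00Z",
        "decoded_payload": {"temp": 21.5},
    },
})

row = uplink_to_row(sample)
print(row[0])  # sensor-01
# The row would then feed a parameterized insert, e.g.:
# INSERT INTO uplinks (device_id, received_at, payload) VALUES (%s, %s, %s)
```

On AWS this parser would sit inside whatever receives the webhook or MQTT subscription, with the insert executed over the Aiven connection string.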
AI && ML && AI chatbot && AI image processing developer
Job Title: AI RAG Chatbot implementation Job Description: We propose enhancing the functionality of our existing AI chatbot, which is currently implemented within our healthcare portal. While the chatbot performs basic tasks, it requires significant improvement to become more interactive, deliver personalized responses, and provide accurate and up-to-date information from all existing internal and external data sources. At present, the chatbot operates primarily using Selenium, FastAPI, Retrieval-Augmented Generation (RAG), Groq LLM, and ChromaDB. However, it relies exclusively on website content as its data source, which limits its ability to deliver precise and contextually relevant responses. To address this, we suggest expanding the chatbot’s capabilities to process and integrate data from multiple sources, including: Website content: Utilizing existing site information for general interactions. Two databases: PostgreSQL and MySQL containing critical information. Email content: Incorporating relevant details from email data to provide more comprehensive responses. By integrating these additional data sources, the chatbot will be better equipped to process user queries, extract meaningful insights, and deliver comprehensive and personalized interactions. Our goal is to transform the chatbot into a robust, intelligent solution capable of meeting user needs with increased precision and contextual relevance, while leveraging the latest advancements in AI and natural language processing. This enhancement initiative will ensure the chatbot remains a comprehensive and reliable tool for users, significantly improving the overall user experience on the healthcare portal.
Fix PostgreSQL Connection Issue on Apache/Passenger Server
We are looking for an experienced Linux server administrator or DevOps professional to troubleshoot and fix a PostgreSQL connection issue on our Apache/Passenger-based application.

Issue Details:
- Our application fails to connect to the PostgreSQL database and shows the following error: "connection to server at "172.17.0.1", port 5433 failed: Connection refused. Is the server running on that host and accepting TCP/IP connections?"
- The issue seems to be related to database connectivity, firewall settings, or server configuration.
- The PostgreSQL database is running on port 5433, not the default 5432.
- The application is deployed using Apache + Passenger.

Expected Fix:
- Diagnose and fix the PostgreSQL connection issue.
- Ensure the application can successfully connect to the database.
- Verify and update PostgreSQL, firewall, and network configurations if needed.
- Restart necessary services (Apache, Passenger, PostgreSQL) after the fix.
- Provide documentation on what was changed for future reference.

Requirements:
- Experience with Linux servers (Ubuntu or Debian preferred).
- Strong knowledge of PostgreSQL configuration and troubleshooting.
- Experience with Apache + Passenger.
- Familiarity with Docker (if the database is running inside a container).

If you are confident in fixing this issue, please apply with:
- Your relevant experience
- Estimated time to complete the fix
- Any clarifying questions

Looking forward to working with an expert who can resolve this quickly!
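A "Connection refused" at the TCP level usually means nothing is listening on that host:port, so a sensible first step is a raw reachability check before touching the application. A hedged diagnostic sketch; note 172.17.0.1 is Docker's default bridge IP, which hints the database may be in a container that isn't publishing port 5433:

```python
# Hypothetical preflight for the "connection refused" error: confirm
# the non-default port (5433) is reachable at the TCP level first.

import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """TCP-level check, roughly equivalent to `nc -z host port`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def make_dsn(host, port, dbname, user):
    """Build the libpq-style connection string the app should use."""
    return f"host={host} port={port} dbname={dbname} user={user}"

# Things to verify if port_open("172.17.0.1", 5433) is False:
#   - postgresql.conf: listen_addresses = '*' (not just 'localhost')
#   - postgresql.conf: port = 5433
#   - pg_hba.conf: a `host` line allowing the app server's subnet
#   - firewall: e.g. allow 5433/tcp
#   - Docker: container started with `-p 5433:5433` (or network mode set)
print(make_dsn("172.17.0.1", 5433, "app", "deploy"))
```

The dbname/user values above are placeholders; the checklist items are the standard PostgreSQL connectivity suspects, not findings about this specific server.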
opportunity
Scoro API Integration with Excel via Heroku
Description: We are looking for an experienced developer to help us integrate Scoro’s API with Excel via a PostgreSQL database on Heroku. The goal is to enable dynamic data extraction from Scoro and allow real-time reporting in Excel with automated data refresh capabilities. Scope of Work: Set Up PostgreSQL Database on Heroku Configure a database to store data extracted from Scoro. Ensure secure and efficient data management. Extract & Sync Data from Scoro API Connect to Scoro’s API and retrieve relevant data (quotes, project profitability, sales performance). Automate data extraction at scheduled intervals. Connect Heroku Database to Excel Enable Excel to query and pull data directly from Heroku. Ensure data refresh with minimal manual input. Testing & Optimization Ensure data accuracy and smooth integration. Provide basic guidance on how to refresh reports and troubleshoot. Requirements: ✅ Experience with Scoro API (or similar CRM/business management APIs). ✅ Strong knowledge of PostgreSQL & Heroku deployment. ✅ Proficiency in Excel’s data connection tools (Power Query, ODBC, or API connections). ✅ Ability to write Python or Node.js scripts to automate data extraction. ✅ Good communication skills to provide documentation or guidance on usage. Project Budget: We are open to proposals, but we estimate the budget to be £500–£1000 depending on experience and project timeline. Project Timeline: Ideally, we’d like this completed within 2–3 weeks. How to Apply: Please submit a proposal outlining: Your experience with similar API integrations. A brief overview of how you’d approach this project. Any relevant projects or portfolio examples. Looking forward to working with the right expert on this project!
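The extract-and-sync step above boils down to flattening Scoro API responses into rows for the Heroku Postgres table that Excel then queries. A sketch under stated assumptions: the response shape and field names below are placeholders, not Scoro's real schema.

```python
# Sketch of the extract/transform step: flatten an assumed Scoro-style
# JSON response into rows ready for insertion into Heroku Postgres.
# Excel would then read the table via ODBC or Power Query.

def quotes_to_rows(api_response: dict) -> list:
    """Convert {'data': [...]} into (id, project, amount, currency) rows.
    Field names are hypothetical placeholders for Scoro's schema."""
    rows = []
    for q in api_response.get("data", []):
        rows.append((q["id"], q["project"], float(q["sum"]), q["currency"]))
    return rows

resp = {"data": [
    {"id": 101, "project": "Website rebuild", "sum": "4200.00", "currency": "GBP"},
    {"id": 102, "project": "Brand refresh", "sum": "1500.50", "currency": "GBP"},
]}
print(quotes_to_rows(resp)[0])  # (101, 'Website rebuild', 4200.0, 'GBP')
```

Run on a schedule (e.g. Heroku Scheduler), the rows would feed an upsert so re-running a sync never duplicates quotes.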
opportunity
Carpooling Webapp (BlaBlaCar-Style) with Laravel or Open Source
1. Core Features of a BlaBlaCar Clone
To build a similar platform, you'll need:
• User registration/login
• Driver/rider matching system
• Trip creation and search
• Messaging system
• Booking & payment (optional)
• Admin dashboard
• AI marketing & recommendations
• Integration of international car databases

2. Tools and Platforms to Use
A. Website Builders & Frameworks (Free/Open-Source)
Use open-source platforms to avoid high development costs:
• MobilityEngine (GitHub): A free ride-sharing platform codebase.
• Sharetribe (Free trial / open-source version): Marketplace builder, can be adapted for ride-sharing.
• Laravel + Vue/React: If you want a custom-built app, Laravel has many free ride-sharing starter kits.
B. Admin Panel
• Laravel Voyager or Laravel Nova (Free/basic tier): Admin panels for managing users, rides, payments, etc.
• Forest Admin (Free for devs): Powerful admin interface.
C. AI Marketing & Recommendations
• ChatGPT API: For chatbots, trip recommendations, or customer service automation.
• Mailchimp (Free tier): Email marketing automation.
• Manychat: Free AI-based chatbot for Messenger or Instagram.
D. Car Database Integration
For free/affordable databases:
• Open Vehicle Data (USA - NHTSA database)
• European Car API (Free tier for some data)
• AutoData API (Some free plans)
• Caribbean, Africa, Asia – use crowd-sourced or local government transport databases (some public/open data available).

3. Hosting Platforms
Affordable or free options:
• Render – Free tier, good for hosting web apps.
• Vercel – Free for frontend hosting.
• Supabase – Free PostgreSQL database + backend functions.
• Firebase – Free tier with user auth, database, hosting.

4. Optional: No-Code or Low-Code Tools
If you don't want to code much:
• Bubble – No-code builder (Free tier, with plugins).
• Glide – For mobile-first versions.
• Tilda or Webflow for frontend UI.

5. How to Start
1. Choose your platform (e.g., Sharetribe, Laravel, or Bubble).
2.
Set up user accounts, ride-posting, and matching.
3. Integrate a car database API.
4. Add AI chatbot and marketing tools.
5. Deploy to a free hosting platform.
6. Promote with free social media & SEO tools.

RED FLAGS TO AVOID
• They avoid giving a timeline or breakdown of how they'll build it
• No examples of previous projects
• Over-promising everything under $200 (likely poor quality or scam)
• No GitHub or portfolio links
Forensic SQL DBMS Auditing.
Description: I am seeking the expertise of a forensic expert to conduct a thorough audit of my SQL Database Management System (DBMS). The audit will focus on user login/logout, data read/write operations, and schema changes. The goal is to identify any unauthorized access, data tampering, or regulatory compliance issues. The ideal candidate should have a strong understanding of SQL and be familiar with various SQL DBMS (MySQL, PostgreSQL, SQL Server). Experience in forensic auditing of databases is essential, as well as knowledge of compliance regulations. The candidate should be able to detect and interpret signs of unauthorized access and data tampering. Additional information about the project can be discussed further, and I am open to suggestions on what additional information you may require from me.
Need to develop back end for a website
This project requires developing the backend architecture and implementation for a financial planning Software as a Service application targeting Canadian customers. The web application will help users plan for retirement and manage their finances digitally. Tech stack includes Node.js, Django or Laravel for the API backend along with a PostgreSQL database. API endpoints need to be created to handle user authentication, data retrieval and storage. Developer should have strong experience building backend systems, creating RESTful APIs, database modeling and security best practices. Looking for someone proficient in backend web development to build out the foundation for this financial SaaS.
Directus - custom API implementation
I'm looking for an experienced Directus developer to implement four APIs that integrate with a PostgreSQL or SQLite database. Two APIs (single key-value pair of JSON payload) require small custom logic. The database will contain three interconnected tables storing application data. The successful candidate will configure Directus to authenticate and authorize API requests conforming to RESTful standards. Roles/users/CRUD permissions can be assigned individually to each of these endpoints. Important: the API responses must be clear and unique to each error. E.g. an ambiguous message such as "message": "The license key is invalid or already activated" is difficult to handle if a customer complains. Directus experience combined with clear communication and thorough work will be viewed favorably. Please provide portfolio examples showcasing similar integrations you have successfully developed.
opportunity
Looking for a senior frontend engineer
Hi there, I am looking for a senior frontend engineer skilled in TypeScript, React, Node.js, Styled Components, Cypress, PostgreSQL 14, and AWS. You should also be well versed in API integration (Google Maps API), familiar with Turborepo and agile Scrum methodology, and experienced with Jira, Bitbucket, and Confluence. If you are interested in this job, don't hesitate to contact me. Thanks, Nenad.
Payment Integration
We are seeking a Payment Integration Developer to join our team and help build seamless, secure, and scalable payment solutions. The ideal candidate will have experience integrating payment gateways, ensuring transaction security, and optimizing payment flows for a smooth user experience. Responsibilities: Integrate and maintain third-party payment gateways (e.g., Stripe, PayPal, Square, Authorize.net, Braintree). Develop secure APIs and payment processing systems that support multiple payment methods (credit/debit cards, digital wallets, ACH, etc.). Optimize payment flows for performance, security, and compliance (PCI-DSS, GDPR). Troubleshoot payment-related issues and ensure high availability and reliability. Work closely with frontend and backend teams to implement seamless checkout experiences. Implement fraud detection and prevention mechanisms. Stay up to date with the latest payment technologies and industry regulations. Requirements: Strong experience with payment gateway APIs (e.g., Stripe, PayPal, Adyen, Razorpay). Proficiency in backend development using Python, Node.js, Java, or PHP. Experience with RESTful APIs, webhooks, and third-party API integrations. Knowledge of encryption, tokenization, and security best practices in payment processing. Familiarity with PCI compliance and secure transaction processing. Hands-on experience with databases (MySQL, PostgreSQL, MongoDB). Understanding of cloud platforms (AWS, GCP, or Azure) and serverless payment solutions. Experience with e-commerce platforms (Shopify, WooCommerce, Magento) is a plus. Many Thanks, Marc
opportunity
Big Data Developer /Consultant
Looking for a Big Data Developer – Telecom Analytics Software We are seeking an experienced Big Data Developer to develop a telecom analytics platform that processes Network Data, Customer Location Data, Traffic Routing, and Network Optimization. Key Responsibilities: ✅ Develop a Big Data analytics platform for real-time telecom insights ✅ Integrate network data, customer behavior, and traffic analysis tools ✅ Implement AI-driven network optimization and predictive analytics ✅ Design scalable and secure data pipelines ✅ Work with streaming technologies (Kafka, Spark, Flink, etc.) Requirements: ✔ Experience with Big Data frameworks (Hadoop, Spark, Flink) ✔ Strong Python, Scala, or Java programming skills ✔ Knowledge of telecom data sources (CDRs, location data, network logs) ✔ Experience with real-time analytics and machine learning models ✔ Familiarity with cloud platforms (AWS, GCP, Azure) and databases (MongoDB, Cassandra, PostgreSQL)
opportunity
RFP – PI Network-Based Event Prediction DApp
We are looking for an experienced blockchain developer or a small development team to build a streamlined event prediction market MVP within 4–6 weeks. The platform will leverage PI Network wallets exclusively for user deposits, withdrawals, and authentication, with backend processes executed efficiently using Polygon blockchain smart contracts. Estimated Budget: $2500 - $3,000 CAD (negotiable based on experience and proposal quality) Scope of Work The MVP must provide essential prediction market functionalities with a strong emphasis on simplicity, user-friendliness, and compliance: Core Functionalities Event Prediction Engine (Binary Outcomes) Users predict YES/NO outcomes on predefined events using PI. Events approved and managed via admin interface. PI Network Wallet Integration Seamless deposit and withdrawal via PI Network Wallet. PI transactions managed securely off-chain until settlement. Polygon Smart Contracts Simple, transparent smart contracts for outcome verification and PI settlement. Backend covers blockchain gas fees (no user-facing MATIC fees). Admin Dashboard Interface for managing events, user disputes, and monitoring transactions. Admin approval required for event creation to ensure compliance. Automated Market Settlement Settlement initiated automatically based on user consensus. Admin intervention for resolution disputes. Technical Requirements Blockchain & Wallets: PI Network Wallet SDK, Polygon Blockchain (Solidity), Web3.js/Ethers.js Backend: Node.js or Python (FastAPI/Django), PostgreSQL/MongoDB Frontend: React.js/Vue.js compatible with PI Browser Open Source Usage: Preference for legally licensed or open-source libraries Compliance and Regulation The platform must adhere strictly to all applicable regulations and best practices regarding blockchain and prediction markets. Explicit gambling terminology is to be avoided; use "event prediction" or "forecasting" instead. 
Compensation & Partnership Options We offer flexible compensation structures: Milestone-based Payments Partial or full compensation in PI (optional) Opportunity for long-term partnership with a potential revenue-sharing model We seek practical, efficient, and regulation-compliant solutions with an open and collaborative development approach. Interested developers or teams are encouraged to propose their preferred working arrangements and provide examples of relevant previous projects.
Create chatbot for hospital staff using LLMs and a Data Pipeline
This proof-of-concept aims to develop a conversational agent to assist hospital staff. The selected individual will build a data pipeline in Azure Data Lake to organize relevant information from an existing PostgreSQL database into a machine learning format. A natural language model will then be trained on this dataset to power a chatbot with GPT-like abilities. The intelligence behind the agent will come from the constructed machine learning model. An API must also be developed connecting the trained model to the chatbot interface. This prototype seeks to demonstrate the value of a dialogue solution for clients by deriving insights from structured data and responding to queries similar to human conversation. Applicants should possess skills in data engineering, machine learning, software development and designing applied AI systems.
opportunity
Fully Automated EPOS Sales Report Generator - READ DESCRIPTION
Job Overview: We need a custom EPOS report generation tool that allows us to input total drink sales and generates a realistic, itemized sales report with timestamps spread across the event day. The system should then export this data into a professional PDF report. Key Responsibilities: Develop a data entry system where users input total sales per drink. Build an algorithm to distribute sales over a set time period (randomized but realistic). Implement a reporting dashboard to view/edit generated reports. Create a PDF export function that formats the data into a professional sales report. Optimize for speed and efficiency, handling thousands of transactions. Ensure a user-friendly interface for non-technical users. Technical Requirements: Backend: Python (Flask/Django), Node.js, or PHP (Laravel) Database: MySQL, PostgreSQL, or Firebase Frontend: React.js, Vue.js, or a simple dashboard UI PDF Generation: ReportLab (Python), jsPDF (JavaScript), TCPDF (PHP) Hosting: AWS, Firebase, or DigitalOcean Ideal Candidate: Experience in POS systems, data automation, or reporting tools. Strong skills in backend development & database management. Knowledge of PDF generation and formatting. Ability to create an intuitive user interface. Understanding of sales trends and event-based data analysis is a plus.
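The core algorithm requested (spread known per-drink totals across the event window with randomized but realistic timestamps) can be sketched directly. Function and parameter names are illustrative; the key properties are that totals are preserved exactly and output is reproducible.

```python
# Sketch of the sales-distribution algorithm: one line item per unit
# sold, with a random timestamp inside the event window, sorted
# chronologically. Seeded so the same inputs give the same report.

import random
from datetime import datetime, timedelta

def distribute_sales(totals: dict, start: datetime, end: datetime, seed=0):
    """totals: {drink_name: units_sold}. Returns a chronologically
    sorted list of (timestamp, drink) line items; totals are preserved."""
    rng = random.Random(seed)
    span = (end - start).total_seconds()
    items = []
    for drink, count in totals.items():
        for _ in range(count):
            offset = rng.uniform(0, span)
            items.append((start + timedelta(seconds=offset), drink))
    items.sort()
    return items

rows = distribute_sales({"lager": 3, "gin": 2},
                        datetime(2025, 6, 1, 18), datetime(2025, 6, 1, 23))
print(len(rows))  # 5
```

For more "realistic" shapes, the uniform draw could be swapped for a peak-weighted one (e.g. `rng.triangular` centered on the busiest hour); the PDF export step would then iterate over these rows.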
GCP Dataflow expert
We are looking for an experienced Google Cloud Platform (GCP) Dataflow expert to help us build streaming and batch data pipelines to ingest, transform and load data into our BigQuery data warehouse. The successful candidate should have extensive hands-on experience designing, developing and optimizing Dataflow pipelines that ingest data from Pub/Sub and other sources for both real-time and batch processing use cases. They should be proficient in Java/Python and have a deep understanding of Dataflow concepts like windowing, triggers, side inputs etc. Experience designing and architecting scalable data warehousing solutions on BigQuery is essential. The pipelines need to support ingesting millions of records per day from various APIs and services. Experience integrating Dataflow with other GCP services like Cloud Functions, Datastore, Storage etc. is preferred. This is a short-term contracting role where you will work closely with our in-house team of developers and data scientists. By leveraging your strong Dataflow expertise, you will help us build a robust data ingestion and ETL layer to serve the analytics and reporting needs of our fast-growing SaaS platform. Experience with PostgreSQL and Snowflake is a plus. The ideal candidate should have at least 3 years of relevant hands-on experience developing complex Dataflow pipelines and data warehousing solutions on Google Cloud Platform. Proficiency with programming languages such as Java/Python is mandatory.
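The windowing concept the posting mentions can be illustrated without Beam itself: fixed windowing assigns each timestamped record to a bucket of fixed width, then aggregates per bucket. A plain-Python simulation of that idea (in Beam this would be `WindowInto(FixedWindows(60))` followed by a group-and-combine):

```python
# Plain-Python simulation of Dataflow-style fixed windows: assign each
# (timestamp, value) record to a 60-second window, then sum per window.

from collections import defaultdict

def fixed_windows(records, width_s=60):
    """records: iterable of (timestamp_seconds, value). Returns
    {window_start_seconds: sum_of_values}, like GroupByKey + Combine."""
    out = defaultdict(int)
    for ts, value in records:
        window_start = ts - (ts % width_s)  # floor to window boundary
        out[window_start] += value
    return dict(out)

events = [(10, 1), (59, 2), (60, 5), (125, 3)]
print(fixed_windows(events, width_s=60))  # {0: 3, 60: 5, 120: 3}
```

Real Dataflow adds the hard parts this sketch omits: watermarks deciding when a window is complete, triggers for early/late firings, and distributed execution.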
opportunity
Automated Data Processing Portal
Developer Job Specification: Automated Data Processing Portal for Warm & Inflyte

Overview:
We are looking for an experienced backend developer (preferably with API integration and automation experience) to build a custom platform/portal that automates our current manual data processing workflow between Warm Radio Airplay Tracking and Inflyte DJ Feedback.

We currently download this report from Warm, which is plays on radio:
https://www.dropbox.com/scl/fi/hwyb5rwx6bwyxfvmptkiz/WARM-Report-for-Mau-P-All-Songs-2024-12-13_2025-03-13.xlsx?rlkey=yf2a9tqooyi4fgvk2t4c6xejk&e=1&dl=0
and this from Inflyte, which is DJ feedback:
https://www.dropbox.com/scl/fi/xi3d5fc3jkf4txtyiopjc/Inflyte_stats_189484-4.docx?rlkey=4oiagwg19beuo9lcmjtyk13lr&e=1&dl=0

Currently, we download data from both platforms manually, reformat it in Excel, and update a Google Sheet weekly. Example:
https://www.dropbox.com/scl/fi/q1uc6iqcf3jqa2xuwxs3k/Mau-P-The-Less-I-know-The-Better-Radio-Club-Report-wc-3_3-NERVOUS.xlsx?rlkey=234f890e9799b4pv7n1z3llic&e=1&dl=0

We need a streamlined automated solution to handle this process.

Project Requirements

1. Core Functionality
• API Integration (https://www.dropbox.com/scl/fi/nyfhnldl6g3qn0jhiowc8/Inflyte-Customer-API-v1.5.pdf?rlkey=k52sdcygmt3uiv2u7fy6cj968&e=1&dl=0)
◦ Connect to Warm API (Radio Airplay Tracking) - https://www.warmmusic.net/
◦ Connect to Inflyte API (DJ Feedback) - https://inflyteapp.com/
◦ Extract relevant data from both sources
• Data Processing & Transformation
◦ Normalise, clean, and format the extracted data to match our required structure
◦ Merge both data sources appropriately
• Google Sheets Integration
◦ Append new weekly data to an existing Google Sheet
◦ Move previous week's data to the bottom of the sheet
◦ Ensure easy manual editing when required

2. Additional Features (Nice-to-Have)
• Automated Weekly Scheduler
◦ Set up an automatic weekly data update
◦ Option for manual trigger (button inside a simple web UI)
• Basic User Interface (Optional but preferred)
◦ Secure login for internal team
◦ Dashboard to view processed data logs & allow manual refresh
• Error Handling & Logging
◦ Notify us via email/Slack if the automation fails

Tech Stack Preference (Flexible)
• Backend: Python (Flask/Django) or Node.js (Express)
• Database: Google Sheets API (or PostgreSQL if needed)
• Hosting: AWS Lambda, Google Cloud Functions, or Heroku
• Frontend (Optional): React/Next.js (for basic UI)

Ideal Candidate Skills
• Experience working with API integrations (RESTful APIs, JSON)
• Proficiency in Python or Node.js for automation
• Google Sheets API knowledge
• Cloud hosting experience (AWS/GCP/Heroku)
• Strong problem-solving and debugging skills

How to Apply
To be considered, please send a proposal including:
• Relevant past experience with API automation & Google Sheets
• Your suggested approach for building the system
• Any portfolio/examples of similar work

Looking forward to working with a skilled developer who can streamline our process and save us valuable time!
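The sheet-rotation rule described (append the new week's rows at the top, push the previous week's block down) is simple list logic once the rows are extracted. A sketch with invented row contents; the real version would write the same ordering back via the Google Sheets API (`spreadsheets.values.update`):

```python
# Sketch of the weekly Google Sheet rotation: keep the header row,
# place the new week's rows first, and move everything that was there
# before below them. Row contents here are hypothetical examples.

def rotate_weeks(sheet_rows, new_week):
    """sheet_rows: header + existing data rows. Returns header, then
    the new week's rows, then all previously existing rows."""
    header, old = sheet_rows[0], sheet_rows[1:]
    return [header] + new_week + old

sheet = [["track", "plays", "week"],
         ["Track A", 41, "wc 3/3"]]
updated = rotate_weeks(sheet, [["Track A", 57, "wc 10/3"]])
print(updated[1])  # ['Track A', 57, 'wc 10/3']
```

Because the function rebuilds the full range rather than editing in place, a failed API write leaves the original sheet untouched, which keeps manual editing safe between runs.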