
Senior Software Engineer - Web Data Team

ZoomInfo
Full-time
Remote
Worldwide
Remote Engineering

ZoomInfo is where careers accelerate. We move fast, think boldly, and empower you to do the best work of your life. You'll be surrounded by teammates who care deeply, challenge each other, and celebrate wins. With tools that amplify your impact and a culture that backs your ambition, you won't just contribute. You'll make things happen, fast.

We're looking for an exceptional Senior Software Engineer to serve as the de facto technical lead for our Web Data Acquisition (WDA) team, designing and leading the next generation of ZoomInfo's web crawling and data extraction infrastructure. This is a highly visible, hands-on technical leadership role: you'll own the architectural direction for our crawling systems, unify our crawling platforms into a single best-in-class stack, and help elevate a high-performing engineering team. You'll solve complex distributed-systems challenges, build modular tooling that accelerates delivery, and set the standard for observability and operational excellence.
This is not a people-management role: a dedicated manager handles all HR and administrative responsibilities, and a product manager connects business needs with technical work. Your focus is 100% technical leadership, mentorship, and hands-on execution.

What You'll Bring

Technical Leadership & System Design

  • Proven experience building web crawling or large-scale data systems from scratch
  • Strong architectural skills designing scalable, fault-tolerant distributed systems
  • A track record of leading complex technical initiatives and driving architectural direction for teams
  • Demonstrated ability to evolve production systems incrementally while maintaining reliability

Data Engineering Expertise

  • Deep background in large-scale data engineering (terabytes daily)
  • Hands-on experience with cloud data warehouses (BigQuery, Snowflake)
  • Experience with Apache Kafka, Kubernetes (GKE/EKS), and orchestration tools (Airflow)
  • Familiarity with multi-cloud environments (GCP + AWS)
  • Expertise designing and operating ETL/ELT pipelines

Web Crawling & Data Extraction

  • Deep expertise in web crawling technologies and advanced scraping (Scrapy or similar)
  • Experience extracting structured and unstructured web data, including SERP extraction
  • Knowledge of proxy infrastructure management, anti-bot detection systems, and ethical crawling practices