WHO WE ARE
Welcome to TELUS Digital (https://www.telusdigital.com/), where innovation drives impact at a global scale. As an award-winning digital product consultancy and the digital division of TELUS (https://www.telus.com/en/), one of Canada's largest telecommunications providers, we design and deliver transformative customer experiences through cutting-edge technology, agile thinking, and a people-first culture.
With a global team across North America, South America, Central America, Europe, Africa, and APAC, we offer end-to-end expertise across various service offerings: Web, Mobile & Digital Marketing | Enterprise AI | Customer Care AI & Technology | Enterprise Technology Modernization
From mobile apps and websites to voice UI, chatbots, AI, customer service, and in-store solutions, TELUS Digital enables seamless, trusted, and digitally powered experiences that meet customers wherever they are, all backed by the secure infrastructure and scale of our multi-billion-dollar parent company.
LOCATION & FLEXIBILITY
INDIA HYBRID: NOIDA
This is a hybrid role, requiring work from our Noida/Bengaluru office 1 to 2 times per week. Our office culture is designed to foster in-person innovation, collaboration, and connection with team members, local and visiting, from other global offices.
THE OPPORTUNITY
As a Data Engineer, you'll focus on solving problems and creating business value by building solutions that are reliable and scalable enough for the size and scope of the company. You will build custom pipelines on the GCP stack, and you will be part of teams that implement vendor-sourced enterprise software: configuring it, customizing it, and integrating it with other internal systems.
RESPONSIBILITIES
- 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience manipulating, processing, and extracting value from datasets.
- Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others.
- Build modular code for reusable pipelines or complex ingestion frameworks that simplify loading data into a data lake or data warehouse from multiple sources.
- Work closely with analysts and business process owners to translate business requirements into technical solutions.
- Coding experience in scripting and programming languages (shell scripting, Python, SQL).
- Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Cloud SDK, Cloud Pub/Sub, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, Dataproc (good to have), Bigtable).
- Maintain the highest levels of development practice, including: technical design, solution development, systems configuration, test documentation
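To give candidates a flavor of the "modular ingestion framework" responsibility above, here is a minimal, hedged sketch in Python. The `Source`/`Sink`/`run_pipeline` names are hypothetical and not part of any TELUS codebase; a production version would replace the in-memory sink with a BigQuery or Cloud Storage writer.

```python
from abc import ABC, abstractmethod
from typing import Callable, Iterable

class Source(ABC):
    """Produces raw records from an upstream system (file, API, queue, ...)."""
    @abstractmethod
    def read(self) -> Iterable[dict]: ...

class Sink(ABC):
    """Writes records to a destination (e.g. a data lake or warehouse table)."""
    @abstractmethod
    def write(self, records: Iterable[dict]) -> int: ...

class ListSource(Source):
    """Toy source backed by an in-memory list, for illustration only."""
    def __init__(self, records: list[dict]):
        self.records = records
    def read(self) -> Iterable[dict]:
        return iter(self.records)

class MemorySink(Sink):
    """Toy sink that collects rows in memory; swap for a warehouse loader."""
    def __init__(self):
        self.rows: list[dict] = []
    def write(self, records: Iterable[dict]) -> int:
        self.rows.extend(records)
        return len(self.rows)

def run_pipeline(source: Source, sink: Sink,
                 transform: Callable[[dict], dict] = lambda r: r) -> int:
    """Generic driver: any Source/Sink pair plugs in without code changes."""
    return sink.write(transform(rec) for rec in source.read())
```

The modularity is the point: adding a new data source means writing one small `Source` subclass, while the driver and all sinks stay untouched.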