12.0 - 15.0 years
35 - 60 Lacs
Chennai, Bengaluru
Hybrid
Job Description:

Job Title: GCP Solution Architect
Location: Chennai | Bangalore
Experience: 12-15 years in IT

Key Responsibilities
- Architect and lead GCP-native data and AI solutions tailored to AdTech use cases such as real-time bidding, campaign analytics, customer segmentation, and lookalike modeling.
- Design high-throughput data pipelines, audience data lakes, and analytics platforms leveraging GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Vertex AI.
- Collaborate with ad operations, marketing teams, and digital product owners to understand business goals and translate them into scalable, performant solutions.
- Integrate with third-party AdTech and MarTech platforms, including DSPs, SSPs, CDPs, DMPs, ad exchanges, and identity resolution systems.
- Ensure architectural alignment with data privacy regulations (GDPR, CCPA) and support consent management and data anonymization strategies.
- Drive technical leadership across multi-disciplinary teams (Data Engineering, MLOps, Analytics) and enforce best practices in data governance, model deployment, and cloud optimization.
- Lead discovery workshops, solution assessments, and architecture reviews during pre-sales and delivery cycles.

Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- GCP services: BigQuery, Cloud Pub/Sub, Dataflow, Dataproc, Cloud Composer (Airflow), Vertex AI, AI Platform, AutoML, Cloud Functions, Cloud Run, Looker, Apigee, Dataplex, GKE.
- Deep understanding of programmatic advertising (RTB, OpenRTB), cookieless identity frameworks, and AdTech/MarTech data flows.
- Experience integrating or building components such as Data Management Platforms (DMPs), Customer Data Platforms (CDPs), Demand-Side Platforms (DSPs), ad servers, attribution engines, and real-time bidding pipelines.
- Event-driven and microservices architecture using APIs, streaming pipelines, and edge delivery networks.
- Integration with platforms such as Google Marketing Platform, Google Ads Data Hub, Snowplow, Segment, or similar.
- Strong understanding of IAM, data encryption, PII anonymization, and regulatory compliance (GDPR, CCPA, HIPAA where applicable).
- Experience with CI/CD pipelines (Cloud Build), Infrastructure as Code (Terraform), and MLOps pipelines using Vertex AI or Kubeflow.
- Strong experience in Python and SQL; familiarity with Scala or Java is a plus.
- Experience with version control (Git), Agile delivery, and architectural documentation tools.

If you know someone suitable, feel free to forward their resume to aarthi.murali@zucisystems.com.

Regards,
Aarthi Murali
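As an illustration of the kind of streaming work this role describes, below is a minimal, hedged sketch of an Apache Beam pipeline (Python SDK) that reads bid events from Pub/Sub and writes them to BigQuery, as one would run on Dataflow. The project, subscription, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: streaming bid events from Pub/Sub into BigQuery with Apache Beam.
# Resource names (project, subscription, table) are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON bid event published to Pub/Sub."""
    event = json.loads(message.decode("utf-8"))
    return {
        "bid_id": event.get("bid_id"),
        "campaign_id": event.get("campaign_id"),
        "bid_price": float(event.get("bid_price", 0.0)),
        "event_time": event.get("event_time"),
    }


def run() -> None:
    # Runner, project, and region would normally come from the command line,
    # e.g. --runner=DataflowRunner --project=my-project --region=us-central1.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadBids" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/bid-events-sub")
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                table="my-project:adtech.bid_events",
                schema="bid_id:STRING,campaign_id:STRING,bid_price:FLOAT,event_time:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```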
Posted 1 week ago
7.0 - 12.0 years
0 Lacs
Pune, Chennai, Bengaluru
Hybrid
Description:

Basic Qualifications:
- Experience: 7 to 10 years
- 7+ years of experience developing in Python
- 3+ years of cloud development experience with GCP (e.g., Pub/Sub, Cloud Run, BigQuery)
- 1+ year of experience scripting in Bash or PowerShell
- 4+ years of networking experience (security, DNS, VPN, cloud, load balancing)
- 4+ years of systems administration experience with at least one operating system (Linux or Windows)

Desired Experience & Skills:
- 3+ years of CI/CD experience
- 2+ years of serverless or container-based architecture experience
- 2+ years of Infrastructure as Code (IaC) experience
- Can autonomously contribute to cloud and application orchestration code and be actively involved in peer reviews
- Can deploy and manage the common tools we use (Jenkins, monitoring, logging, SCM, etc.) from code
- Networking (tcpdump, network flow security analysis; can collect and understand metrics between microservices)
- Experience with authentication technologies (federated auth, SSO)
- Experience in Agile practices
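For context on the Python-on-GCP work referenced above, here is a minimal, hedged sketch of publishing messages to a Pub/Sub topic with the google-cloud-pubsub client; the project and topic names are invented placeholders.

```python
# Minimal sketch: publishing JSON messages to a Pub/Sub topic with the
# google-cloud-pubsub client. Project and topic names are placeholders.
import json

from google.cloud import pubsub_v1

PROJECT_ID = "my-project"      # placeholder
TOPIC_ID = "order-events"      # placeholder


def publish_event(payload: dict) -> str:
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

    # Pub/Sub messages are byte strings; serialize the payload as JSON.
    data = json.dumps(payload).encode("utf-8")
    future = publisher.publish(topic_path, data, source="checkout-service")

    # future.result() blocks until the message is accepted by the server
    # and returns the server-assigned message ID.
    return future.result()


if __name__ == "__main__":
    message_id = publish_event({"order_id": "12345", "status": "created"})
    print(f"Published message {message_id}")
```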
Posted 1 week ago
4.0 - 9.0 years
20 - 35 Lacs
Gurugram
Work from Office
Job Description
- The candidate should have extensive production experience (2+ years) in GCP.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.

Roles & Responsibilities
- 4-10 years of IT experience is preferred.
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
- Strong experience in Big Data technologies: Hadoop, Sqoop, Hive, and Spark, including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services such as Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services such as App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customers' workloads into GCP and provide guidance, cloud adoption models, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
- Technical ability to become certified in required GCP technical certifications.
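To illustrate the Dataproc-and-BigQuery style of work this posting describes, here is a minimal PySpark sketch that reads CSV data from Cloud Storage and writes an aggregate to BigQuery via the spark-bigquery connector. The bucket, dataset, and table names are hypothetical, and the connector is assumed to be available on the Dataproc cluster.

```python
# Minimal sketch of a PySpark batch job for Dataproc: read CSV from GCS,
# apply a simple aggregation, and write to BigQuery. Bucket, dataset, and
# table names are placeholders; the spark-bigquery connector is assumed
# to be available on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery").getOrCreate()

# Read raw sales records from a Cloud Storage bucket (placeholder path).
sales = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("gs://my-bucket/raw/sales/*.csv")
)

# Aggregate daily revenue per store.
daily_revenue = (
    sales
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("store_id", "order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Write to BigQuery; the connector stages data through a temporary GCS bucket.
(
    daily_revenue.write
    .format("bigquery")
    .option("table", "my-project.analytics.daily_revenue")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("overwrite")
    .save()
)
```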
Posted 1 week ago
5.0 - 10.0 years
25 - 35 Lacs
Noida, Pune, Bengaluru
Work from Office
Description:
We are seeking a proficient Data Governance Engineer to lead the development and management of robust data governance frameworks on Google Cloud Platform (GCP). The ideal candidate will bring in-depth expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure high-quality, secure, and compliant data practices aligned with organizational goals.

Requirements:
- 4+ years of experience in data governance, data management, or data security.
- Hands-on experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Dataproc, and Google Data Catalog.
- Strong command of metadata management, data lineage, and data quality tools (e.g., Collibra, Informatica).
- Deep understanding of data privacy laws and compliance frameworks.
- Proficiency in SQL and Python for governance automation.
- Experience with RBAC, encryption, and data masking techniques.
- Familiarity with ETL/ELT pipelines and data warehouse architectures.

Job Responsibilities:
- Develop and implement comprehensive data governance frameworks, focusing on metadata management, lineage tracking, and data quality.
- Define, document, and enforce data governance policies, access control mechanisms, and security standards using GCP-native services such as IAM, DLP, and KMS.
- Manage metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborate with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards.
- Automate processes for data classification, monitoring, and reporting using Python and SQL.
- Support data stewardship initiatives, including the development of data dictionaries and governance documentation.
- Optimize ETL/ELT pipelines and data workflows to meet governance best practices.

What We Offer:
- Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club where you can drink coffee or tea with your colleagues over a game, and offer discounts for popular stores and restaurants!
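The responsibilities above mention automating data classification and applying masking techniques with Python and SQL. Below is a small, generic sketch of one such masking step, pseudonymizing direct identifiers with a salted hash; it is not this team's actual tooling, and the column names and salt are hypothetical.

```python
# Minimal, generic sketch of a governance-style masking step: hash direct
# identifiers before data lands in an analytics zone. Column names and the
# salt value are hypothetical placeholders.
import hashlib

import pandas as pd

SALT = "rotate-me-from-secret-manager"   # placeholder; store in a secret manager
PII_COLUMNS = ["email", "phone_number"]  # columns treated as direct identifiers


def pseudonymize(value: str) -> str:
    """Deterministically hash an identifier so joins still work downstream."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()


def mask_pii(frame: pd.DataFrame) -> pd.DataFrame:
    masked = frame.copy()
    for column in PII_COLUMNS:
        masked[column] = masked[column].fillna("").map(pseudonymize)
    return masked


if __name__ == "__main__":
    customers = pd.DataFrame(
        {
            "customer_id": [1, 2],
            "email": ["a@example.com", "b@example.com"],
            "phone_number": ["555-0100", "555-0101"],
        }
    )
    print(mask_pii(customers))
```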
Posted 1 week ago
8.0 - 13.0 years
3 - 7 Lacs
Mumbai, Maharashtra
Work from Office
The ideal candidate must possess strong communication skills, with an ability to listen, comprehend information, and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive actions and willingness to take up responsibility beyond the assigned work area is a plus.

Business Intelligence Specialist with 8 years of progressive experience in driving marketing and campaign performance analytics. Adept at developing and managing BI dashboards, automating reporting frameworks, and delivering actionable insights to optimize campaign ROI, engagement, and conversions.

Senior Analyst Roles and Responsibilities:
- Dashboarding & Reporting: Build and maintain real-time dashboards and automated performance reports using Tableau / Power BI and SQL to ensure timely insights delivery.
- Performance Analytics & Insights Generation: Conduct funnel analysis, customer behavior modeling, and trend identification to generate insights that enhance marketing strategies.
- Cross-functional Collaboration: Work with marketing, product, and data teams to align on KPIs and translate business needs into analytical solutions and presentations.

Technical and Functional Skills:
- BI Tools: Tableau / Power BI, Looker
- Languages: SQL (PostgreSQL, MySQL), Python (Pandas, Matplotlib) / R
- Data Platforms: Google BigQuery / Snowflake / AWS Redshift
- Marketing Tools: Google Analytics / Adobe Analytics / Salesforce Marketing Cloud
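As a small illustration of the funnel analysis mentioned above, the following hedged sketch computes step-to-step conversion rates from a table of campaign events with pandas; the stage and column names are invented for the example.

```python
# Minimal sketch: step-to-step conversion rates for a marketing funnel
# using pandas. Stage and column names are hypothetical placeholders.
import pandas as pd

FUNNEL_STAGES = ["impression", "click", "signup", "purchase"]

events = pd.DataFrame(
    {
        "user_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
        "stage": [
            "impression", "click", "signup",
            "impression", "click",
            "impression", "click", "signup", "purchase",
        ],
    }
)

# Count distinct users who reached each stage.
users_per_stage = (
    events.groupby("stage")["user_id"].nunique().reindex(FUNNEL_STAGES).fillna(0)
)

# Conversion rate relative to the previous stage (first stage defaults to 100%).
conversion = (users_per_stage / users_per_stage.shift(1)).fillna(1.0)

report = pd.DataFrame({"users": users_per_stage, "step_conversion": conversion})
print(report)
```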
Posted 1 week ago
4.0 - 7.0 years
8 - 14 Lacs
Pune
Hybrid
Job Description

We are hiring an ETL Engineer with GCP experience.
Location: India (Pune)
Experience: 3-7 years

Required Skills and Qualifications:
- 3+ years of experience in Data Engineering roles.
- Strong hands-on experience with Google Cloud Platform (GCP) data services, specifically BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage.
- Mandatory expertise in Informatica.
- Mandatory strong proficiency in SQL and PL/SQL for data manipulation, stored procedures, functions, and complex query writing.
- Ability to optimize BigQuery queries for performance and cost.
- Familiarity with version control systems (e.g., Git).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in an agile environment.
- Bachelor's degree in Computer Science, Engineering, or a related field.
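The posting highlights optimizing BigQuery queries for performance and cost; one common tactic is a dry run to see how many bytes a query would scan before executing it. Below is a minimal sketch with the google-cloud-bigquery client; the dataset, table, and the 10 GiB budget threshold are placeholders.

```python
# Minimal sketch: estimate BigQuery scan cost with a dry run before executing
# a query. Dataset, table, and the budget threshold are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

QUERY = """
    SELECT store_id, SUM(amount) AS revenue
    FROM `my-project.analytics.daily_revenue`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY store_id
"""

# Dry run: the job is validated and priced but never executed.
dry_run_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
dry_run_job = client.query(QUERY, job_config=dry_run_config)
gib_scanned = dry_run_job.total_bytes_processed / 1024 ** 3
print(f"Query would scan about {gib_scanned:.2f} GiB")

# Only run the real query if the estimated scan stays under the budget threshold.
if gib_scanned < 10:
    for row in client.query(QUERY).result():
        print(row.store_id, row.revenue)
```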
Posted 1 week ago
5.0 - 9.0 years
9 - 13 Lacs
Pune
Work from Office
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities worldwide, we support 100+ clients across banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These projects will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Big Data Tester
Location: Pune (for Mastercard)
Experience Level: 5-9 years

Minimum Skill Set Required / Must Have
- Python
- PySpark
- Testing skills and best practices for data validation
- SQL (hands-on experience, especially with complex queries) and ETL

Good to Have
- Unix
- Big Data: Hadoop, Spark, Kafka, NoSQL databases (MongoDB, Cassandra), Hive, etc.
- Data Warehouse: Traditional (Oracle, Teradata, SQL Server); Modern Cloud (Amazon Redshift, Google BigQuery, Snowflake)
- AWS development experience (not mandatory, but beneficial)

Best Fit: Python + PySpark + Testing + SQL (hands-on) and ETL + Good to Have skills
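To illustrate the data validation testing this role calls for, here is a minimal pytest-style sketch that checks row counts, null keys, and a simple reconciliation between source and target PySpark DataFrames; the data and column names are invented for the example.

```python
# Minimal sketch: pytest-style data validation checks on PySpark DataFrames.
# Source/target data and column names are hypothetical placeholders.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[2]").appName("etl-tests").getOrCreate()


def test_row_counts_match(spark):
    source = spark.createDataFrame([(1, "A"), (2, "B"), (3, "C")], ["id", "segment"])
    target = spark.createDataFrame([(1, "A"), (2, "B"), (3, "C")], ["id", "segment"])
    assert source.count() == target.count()


def test_no_null_primary_keys(spark):
    target = spark.createDataFrame([(1, "A"), (2, "B")], ["id", "segment"])
    null_keys = target.filter(F.col("id").isNull()).count()
    assert null_keys == 0


def test_amount_totals_reconcile(spark):
    source = spark.createDataFrame([(1, 100.0), (2, 250.0)], ["id", "amount"])
    target = spark.createDataFrame([(1, 100.0), (2, 250.0)], ["id", "amount"])
    src_total = source.agg(F.sum("amount")).first()[0]
    tgt_total = target.agg(F.sum("amount")).first()[0]
    assert src_total == pytest.approx(tgt_total)
```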
Posted 1 week ago
7.0 - 9.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Educational Requirements
Bachelor of Engineering

Service Line
Data & Analytics Unit

Responsibilities
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of project domain
- Ability to translate functional / non-functional requirements to systems requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
- Technology - Cloud Platform - GCP Database - Google BigQuery

Preferred Skills:
- Technology - Cloud Security - GCP Infrastructure Security
- Technology - Cloud Platform - GCP Database
Posted 1 week ago
2.0 - 5.0 years
13 - 17 Lacs
Bengaluru
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Working with multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Hands-on Python and SQL work; being proactive, collaborative, and able to respond to critical situations
- Analysing data for functional business requirements and interfacing directly with customers

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations you have delivered, including the purpose/KPIs for which the data transformation was done

Preferred technical and professional experience
- Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript, and jQuery
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence
Posted 1 week ago
3.0 - 6.0 years
10 - 14 Lacs
Gurugram
Work from Office
As an entry-level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.

In your role, you may be responsible for:
- Working across the entire system architecture to design, develop, and support high-quality, scalable products and interfaces for our clients
- Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects
- Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability
- Working with a variety of relational and NoSQL databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery)
- Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- SQL authoring, querying, and cost optimisation, primarily on BigQuery.
- Python as an object-oriented scripting language.
- Data pipeline, data streaming, and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming
- Version control system: Git; preferably knowledge of Infrastructure as Code: Terraform.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions

Preferred technical and professional experience
- Experience building and optimising data pipelines, architectures, and data sets.
- Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
Posted 1 week ago
5.0 - 7.0 years
0 - 1 Lacs
Vadodara
Work from Office
Job Overview:
We are looking for a Senior Data Analyst with a strong background in data engineering and analytics using DBT, GCP, SQL, Python, and Looker Studio, and proven experience with AI-powered conversational analytics (e.g., Gemini, ChatGPT). You will manage and enhance our data infrastructure, create impactful dashboards, and contribute to innovative projects integrating AI with analytics.

Role & Responsibilities:
- Maintain and optimize data pipelines and models using DBT on Google Cloud Platform (GCP)
- Develop and manage data transformations in BigQuery
- Create, update, and maintain dashboards in Looker Studio
- Write advanced SQL queries and use Python for data wrangling, automation, and API interactions
- Clean, validate, and standardize large and complex datasets
- Collaborate with business stakeholders to translate analytical needs into technical solutions
- Work on AI and LLM-based analytics projects using tools like Gemini or ChatGPT
- Document data models, processes, and pipeline logic
- Implement data quality checks and ensure consistent, reliable outputs

Required Skills & Experience
5+ years of hands-on experience with:
- SQL (complex queries, optimization, window functions)
- Python (data manipulation, automation scripts, API usage)
- DBT (data modeling, testing, documentation)
- Google Cloud Platform (especially BigQuery)
- Looker Studio (dashboard design, data visualization best practices)

Preferred experience with:
- ETL/ELT tools such as Airbyte, Stitch
- Reverse ETL workflows
- Digital marketing data workflows (e.g., Google Analytics, Google Tag Manager)
- Conversational analytics using LLMs (e.g., Gemini, ChatGPT)
- Prompt engineering and building interfaces between AI and data

Soft Skills:
- Strong analytical and problem-solving skills
- Clear and effective communicator with both technical and non-technical teams
- Detail-oriented, reliable, and autonomous
- Able to manage multiple priorities and meet deadlines
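One of the responsibilities above is using Python for API interactions and loading results into BigQuery for dashboarding. The sketch below shows one hedged way to do that with requests, pandas, and the google-cloud-bigquery client; the API endpoint, dataset, and table names are invented for illustration.

```python
# Minimal sketch: pull records from a REST API with requests and load them
# into a BigQuery table that feeds Looker Studio dashboards. The endpoint
# and the dataset/table names are hypothetical placeholders.
import pandas as pd
import requests
from google.cloud import bigquery

API_URL = "https://api.example.com/v1/campaign-metrics"  # placeholder endpoint
TABLE_ID = "my-project.marketing.campaign_metrics"       # placeholder table


def fetch_metrics() -> pd.DataFrame:
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json()["results"])


def load_to_bigquery(frame: pd.DataFrame) -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE")
    job = client.load_table_from_dataframe(frame, TABLE_ID, job_config=job_config)
    job.result()  # wait for the load job to finish
    print(f"Loaded {job.output_rows} rows into {TABLE_ID}")


if __name__ == "__main__":
    load_to_bigquery(fetch_metrics())
```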
Posted 1 week ago
4.0 - 7.0 years
18 - 20 Lacs
Pune
Hybrid
Job Title: GCP Data Engineer
Location: Pune, India
Experience: 4 to 7 Years
Job Type: Full-Time

Job Summary:
We are looking for a highly skilled GCP Data Engineer with 4 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives.

Key Responsibilities:
- Design and implement scalable and efficient data pipelines on GCP, leveraging Dataproc, BigQuery, Cloud Storage, and Pub/Sub.
- Develop and manage ETL/ELT workflows using Apache Spark, SQL, and Python.
- Orchestrate and automate data workflows using Cloud Composer (Apache Airflow).
- Build batch and streaming data processing jobs that integrate data from various structured and unstructured sources.
- Optimize pipeline performance and ensure cost-effective data processing.
- Collaborate with data analysts, scientists, and business teams to understand data requirements and deliver high-quality solutions.
- Implement and monitor data quality checks, validation, and transformation logic.

Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP)
- Proficiency with Dataproc for big data processing and Apache Spark
- Expertise in Python and SQL for data manipulation and scripting
- Experience with Cloud Composer / Apache Airflow for workflow orchestration
- Knowledge of data modeling, warehousing, and pipeline best practices
- Solid understanding of ETL/ELT architecture and implementation
- Strong troubleshooting and problem-solving skills

Preferred Qualifications:
- GCP Data Engineer or Cloud Architect Certification.
- Familiarity with BigQuery, Dataflow, and Pub/Sub.

Interested candidates can send their resume to pranitathapa@onixnet.com
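To ground the Cloud Composer / Airflow orchestration mentioned above, here is a minimal, hedged DAG sketch that runs a daily BigQuery transformation using the Google provider's BigQueryInsertJobOperator; the project, dataset, and table names are placeholders.

```python
# Minimal sketch of a Cloud Composer (Airflow) DAG that runs a daily BigQuery
# transformation. Project, dataset, and table names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

DAILY_REVENUE_SQL = """
    CREATE OR REPLACE TABLE `my-project.analytics.daily_revenue` AS
    SELECT store_id, DATE(order_timestamp) AS order_date, SUM(amount) AS revenue
    FROM `my-project.raw.sales`
    GROUP BY store_id, order_date
"""

with DAG(
    dag_id="daily_revenue_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["gcp", "bigquery"],
) as dag:
    build_daily_revenue = BigQueryInsertJobOperator(
        task_id="build_daily_revenue",
        configuration={
            "query": {
                "query": DAILY_REVENUE_SQL,
                "useLegacySql": False,
            }
        },
    )
```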
Posted 1 week ago
7.0 - 10.0 years
20 - 27 Lacs
Noida
Work from Office
Job Responsibilities:

Technical Leadership:
• Provide technical leadership and mentorship to a team of data engineers.
• Design, architect, and implement highly scalable, resilient, and performant data pipelines; experience with GCP technologies (e.g., Dataproc, Cloud Composer, Pub/Sub, BigQuery) is a plus.
• Guide the team in adopting best practices for data engineering, including CI/CD, infrastructure-as-code, and automated testing.
• Conduct code reviews, design reviews, and provide constructive feedback to team members.
• Stay up-to-date with the latest technologies and trends in data engineering.

Data Pipeline Development:
• Develop and maintain robust and efficient data pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources.
• Implement data quality checks and monitoring systems to ensure data accuracy and integrity.
• Collaborate with cross-functional teams and business stakeholders to understand data requirements and deliver data solutions that meet their needs.

Platform Building & Maintenance:
• Design and implement secure and scalable data storage solutions.
• Manage and optimize cloud infrastructure costs related to data engineering workloads.
• Contribute to the development and maintenance of data engineering tooling and infrastructure to improve team productivity and efficiency.

Collaboration & Communication:
• Effectively communicate technical designs and concepts to both technical and non-technical audiences.
• Collaborate effectively with other engineering teams, product managers, and business stakeholders.
• Contribute to knowledge sharing within the team and across the organization.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 7+ years of experience in data engineering and software development.
• 7+ years of experience coding in SQL and Python/Java.
• 3+ years of hands-on experience building and managing data pipelines in a cloud environment like GCP.
• Strong programming skills in Python or Java, with experience in developing data-intensive applications.
• Expertise in SQL and data modeling techniques for both transactional and analytical workloads.
• Experience with CI/CD pipelines and automated testing frameworks.
• Excellent communication, interpersonal, and problem-solving skills.
• Experience leading or mentoring a team of engineers.
Posted 1 week ago
4.0 - 8.0 years
20 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Experience: 3 to 7 years
Location: Gurgaon / Pune / Bengaluru
Notice: Immediate to 30 days

Job Profile:
Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well-documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience in running end-to-end analytics for large-scale organizations.

Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud.
- Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost.

Must have:
- Client engagement experience and collaboration with cross-functional teams
- Data engineering background in Databricks
- Capable of working effectively as an individual contributor or in collaborative team environments
- Effective communication and thought leadership with a proven record

Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas
- 3+ years of experience, which must be in data engineering
- Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure
- Prior experience in managing and delivering end-to-end projects
- Outstanding written and verbal communication skills
- Able to work in a fast-paced, continuously evolving environment and ready to take up uphill challenges
- Able to understand cross-cultural differences and work with clients across the globe
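As a small illustration of the Databricks work described above, the following hedged sketch cleans a raw dataset with PySpark and writes it as a Delta table; the storage paths and column names are hypothetical, and it assumes a runtime where Delta Lake support is available.

```python
# Minimal sketch of a Databricks-style PySpark job: clean raw meter readings
# and persist them as a Delta table. Paths and column names are placeholders;
# assumes a runtime where Delta Lake support is available.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("clean-meter-readings").getOrCreate()

raw = spark.read.json("/mnt/raw/meter_readings/")  # placeholder mount path

cleaned = (
    raw
    .dropDuplicates(["meter_id", "reading_ts"])
    .filter(F.col("reading_kwh").isNotNull() & (F.col("reading_kwh") >= 0))
    .withColumn("reading_date", F.to_date("reading_ts"))
)

(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("reading_date")
    .save("/mnt/curated/meter_readings/")  # placeholder output path
)
```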
Posted 1 week ago
4.0 - 9.0 years
6 - 16 Lacs
Panaji, Pune
Work from Office
Join us as a Node.js Developer to design and build scalable backend services, APIs, and microservices. Collaborate across teams, optimize performance, and deploy to cloud. Vue.js & PHP (Laravel) experience a plus. Based in Pune or Goa.
Posted 1 week ago
3.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Product Development Management
Designation: AI/ML Computational Science Manager
Qualifications: Any Graduation
Years of Experience: Experienced

What would you do?
You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results. We work closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. The Product Development Management team manages the end-to-end product development process from conception to design and production start-up, including product structure design, the engineering requirement process, multi-function resource collaboration, and engineering and supply chain integration. The team is also responsible for driving technology design meetings with leadership, proposing technology design and architecture changes, determining technical changes, scheduling projects and resources, and monitoring project timelines.

What are we looking for?
- Expert in executive presentations, client orals, and online presentations targeting multiple stakeholders
- Experience working with other Product Managers and Functional Owners toward the common goal of establishing functional and technology roadmaps
- Experience working in a matrixed organization; comfortable coordinating with, reporting to, or supervising higher- or lower-level resources in a team setup
- Experience influencing indirect associates, vendors, and suppliers for operational success
- Experience in product management, applying product management principles
- Experience across multiple domains in launching or acquiring new products/offerings
- Solid experience working with client/customer management teams to achieve product objectives
- Experience envisioning, assessing, contracting, and onboarding off-the-shelf products to accelerate the goal of establishing a foothold

Roles and Responsibilities:
- In this role, you are required to identify and assess complex problems for your area of responsibility.
- You will create solutions in situations where analysis requires an in-depth evaluation of variable factors.
- The role requires adherence to the strategic direction set by senior management when establishing near-term goals.
- Interaction is with senior management at a client and/or within Accenture, involving matters that may require acceptance of an alternate approach.
- Some latitude in decision-making is involved; you will act independently to determine methods and procedures on new assignments.
- Decisions made in this role have a major day-to-day impact on the area of responsibility.
- The person manages large to medium-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As the heart of our business is driven by technology, we attribute our success to our global and diverse culture. At Quantiphi, we cherish our people and take pride in fostering a culture that thrives on transparency, diversity, integrity, learning, and growth. If the idea of working in an environment that not only encourages you to innovate and excel professionally but also supports your personal growth interests you, then a career with Quantiphi is the right fit for you!

Responsibilities:
- Creating and maintaining LookML code to define data models, dimensions, measures, and relationships within Looker.
- Developing reusable LookML components to ensure consistency and efficiency in report and dashboard creation.
- Building and customizing dashboards to incorporate data visualizations like charts and graphs to effectively present insights.
- Writing complex SQL queries when necessary to extract and manipulate data from underlying databases, and optimizing SQL queries for performance.
- Identifying and addressing bottlenecks affecting report and dashboard loading times, and optimizing Looker performance by tuning queries, caching strategies, and exploring indexing options.
- Configuring user roles and permissions within Looker to control access to sensitive data and implementing data security best practices, including row-level and field-level security.
- Demonstrating a good understanding of the Looker API, SDK, and extension framework.
- Using version control systems like Git to manage LookML code changes and collaborating with other developers.
- Providing training and support to business users to help them effectively navigate and use Looker.
- Diagnosing and resolving technical issues related to Looker, data models, and reports.

Skills Required:
- Experience in Looker's modeling language, LookML, including defining data models, dimensions, and measures.
- Strong SQL skills for writing and optimizing database queries.
- Understanding of different SQL database systems and dialects (GCP/BQ preferable).
- Knowledge of data modeling best practices.
- Proficiency in ETL processes for data transformation and preparation.
- Skill in creating visually appealing and effective data visualizations using Looker dashboard and reporting tools.
- Ability to optimize Looker performance by fine-tuning queries, caching strategies, and indexing options.
- Familiarity with related tools and technologies such as data warehousing (e.g., BigQuery), data transformation tools (e.g., Apache Spark), and scripting languages (e.g., Python).
- Strong analytical and problem-solving skills.
- Knowledge of data governance principles and practices to ensure data quality, privacy, and compliance within Looker.
- Willingness to stay updated on Looker's latest features and best practices in data analytics and BI.
- Creating and maintaining efficient data models.
- Fine-tuning queries and optimizing the overall performance of Looker dashboards.
- Providing training and support to end-users, helping them understand how to use Looker effectively for data analysis and decision-making.
- Excellent understanding of advanced Looker concepts: Liquid, data security, complex derived tables, caching / PDTs, etc.
- Troubleshooting issues related to data modeling, queries, and dashboards, identifying root causes, and implementing effective solutions to resolve them.

If you are passionate about wild growth and enjoy working with happy, enthusiastic over-achievers, your career at Quantiphi promises to be a fulfilling journey!
Posted 1 week ago
2.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
We are looking for an Ecommerce SME Analyst with 8 years of expertise in digital analytics and ecommerce data. In this role, your responsibilities will include analyzing clickstream and user behavior data, using tools such as Adobe Analytics, Python, SQL, and BigQuery to uncover insights that enhance user experience, optimize conversion funnels, and inform strategic product decisions. You will collaborate with cross-functional teams to drive data-informed growth across the ecommerce platform.

Your key responsibilities will involve analyzing ecommerce clickstream data, segmenting and analyzing user behavior, extracting and transforming data using SQL and Python, building dashboards and reports for visualization, supporting product strategy, defining and tracking KPIs, analyzing A/B tests, collaborating with cross-functional teams, ensuring data quality, and continuously learning about industry trends and best practices.

To qualify for this role, you should have a Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, or a related field, along with a minimum of 2 years of experience in ecommerce analytics. Proficiency in Adobe Analytics, SQL, Python, Google BigQuery, and data visualization tools is required. Strong analytical, problem-solving, communication, and collaboration skills are essential for success in this position.

If you are passionate about ecommerce analytics and data science, have hands-on experience with key tools and technologies, and possess a strong ability to translate data insights into actionable recommendations, we encourage you to apply for this exciting opportunity at LTIMindtree Ltd., a global technology consulting and digital solutions company committed to driving business transformation for clients across industries.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer at UST, you will play a crucial role in designing, building, and maintaining scalable data pipelines and ETL processes using GCP services such as Cloud Dataflow, Cloud Dataproc, and BigQuery. Your responsibilities will include implementing and optimizing data storage solutions using GCP technologies like Cloud Storage, Cloud SQL, and Cloud Spanner, as well as developing and maintaining data warehouses and data lakes on GCP to ensure data quality, accessibility, and security.

Collaboration with data scientists and analysts will be essential to understand data requirements and provide efficient data access solutions. You will also need to implement data governance and security measures to ensure compliance with regulations and best practices. Automation of data workflows and implementation of monitoring and logging systems for data pipelines will be part of your daily tasks. Sharing data engineering knowledge with the wider functions and developing reusable data integration patterns and best practices will also be expected from you.

To excel in this role, you should have a BSc/MSc in Computer Science, Information Systems, or a related field, or equivalent work experience. Having proven experience (5+ years) as a Data Engineer or in a similar role, preferably with GCP expertise, will be advantageous. Strong proficiency in SQL and experience with NoSQL databases is required, along with expertise in data modeling, ETL processes, and data warehousing concepts. Significant experience with GCP services such as BigQuery, Dataflow, Dataproc, Cloud Storage, and Pub/Sub is essential. Proficiency in at least one programming language (e.g., Python, Java, or Scala) for data pipeline development is necessary, as well as experience with big data technologies such as Hadoop, Spark, and Kafka. Knowledge of data governance, security, and compliance best practices is also important. GCP certifications (e.g., Professional Data Engineer) are highly advantageous. Effective communication skills are crucial to collaborate with cross-functional teams and explain technical concepts to non-technical stakeholders.

Join UST, a global digital transformation solutions provider, and be part of a team that works side by side with the world's leading companies to make a real impact through transformation. With over 30,000 employees in 30 countries, UST is committed to embedding innovation and agility into their clients' organizations for boundless impact, touching billions of lives in the process.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Data Specialist, you will be responsible for utilizing your expertise in ETL fundamentals, SQL, BigQuery, Dataproc, Python, Data Catalog, data warehousing, and various other tools to contribute to the successful implementation of data projects. Your role will involve working with technologies such as Cloud Trace, Cloud Logging, Cloud Storage, and Data Fusion to build and maintain a modern data platform.

To excel in this position, you should possess a minimum of 5 years of experience in the data engineering field, with a focus on the GCP cloud data implementation suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage. Your strong understanding of very large-scale data architecture and hands-on experience in data warehouses, data lakes, and analytics platforms will be crucial for the success of our projects.

Key Requirements:
- Minimum 5 years of experience in data engineering
- Hands-on experience with the GCP cloud data implementation suite
- Strong expertise in BigQuery (GBQ) queries, Python, Apache Airflow, and SQL (BigQuery preferred)
- Extensive hands-on experience with SQL and Python for working with data

If you are passionate about data and have a proven track record of delivering results in a fast-paced environment, we invite you to apply for this exciting opportunity to be a part of our dynamic team.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
The company believes in conducting business every day based on core values of Inclusion, Innovation, Collaboration, and Wellness, ensuring a global team works together with customers at the center. As part of the team, you will have the opportunity to impact the business by identifying AI/ML opportunities and building solutions that drive results. You will lead ML projects, conduct research to discover new ML techniques, and innovate to enhance team and business efficiencies. Collaborating closely with engineers, analysts, and leaders, you will implement and optimize ML models, establish best practices for model management, deployment, and monitoring, and integrate ML models into products and services. Additionally, you will assist in troubleshooting technical issues and maintain documentation, project tracking, and quality controls.

The ideal candidate will have a degree in engineering, science, statistics, or mathematics, with a strong technical background in machine learning. Excellent communication skills, an analytical mindset, and a passion for problem-solving are essential. Candidates should have at least 3 years of hands-on experience in problem-solving using machine learning, proficiency in Python or Java, and familiarity with technologies like Spark, Hadoop, BigQuery, and SQL. Deep knowledge of machine learning algorithms, explainable AI methods, GenAI, and NLP is required, along with experience with cloud frameworks such as GCP and AWS. Experience in lending and financial services is considered a plus.

The company offers a range of benefits and is committed to Diversity and Inclusion. To understand more about the company's culture and community, visit https://about.pypl.com/who-we-are/default.aspx. If you are interested in joining the Talent Community or have any questions related to your skills, please don't hesitate to apply; the company values all candidates and aims to bridge the confidence gap and imposter syndrome.
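To make the hands-on ML expectations above concrete, here is a small, hedged scikit-learn sketch that trains and evaluates a gradient boosting classifier on a synthetic dataset; the feature set and task are invented purely for illustration and do not reflect this company's models.

```python
# Minimal sketch: train and evaluate a classifier with scikit-learn on a
# synthetic dataset. Features and task are invented for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for an imbalanced tabular problem such as risk scoring.
X, y = make_classification(
    n_samples=5000, n_features=20, n_informative=8, weights=[0.9, 0.1], random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate with ROC AUC, which is robust to the class imbalance above.
scores = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```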
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
About GlobalLogic
GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide in designing and developing innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic assists clients in envisioning possibilities and accelerating their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in various industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductor. GlobalLogic is a Hitachi Group Company.

Requirements

Leadership & Strategy
As part of GlobalLogic, you will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

Leadership Experience
With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, along with strong presentation and communication skills for executive-level reporting, is essential for this role.

Certifications (Preferred)
Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

Technical Excellence
You should have over 10 years of experience in designing and implementing enterprise-scale cloud solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures.

Job Responsibilities

Technical Skills
Your expertise should include expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, Cloud Functions, and others is required. Advanced knowledge of Docker, Kubernetes, and container orchestration patterns, along with experience in cloud security, infrastructure as code, and CI/CD practices, will be crucial for this role.

Cross-functional Collaboration
Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions, leading cross-functional project teams, presenting technical recommendations to executive leadership, and establishing relationships with GCP technical account managers are key aspects of this role.
What We Offer
At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.
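The posting above calls out Cloud Functions alongside Python. As a hedged illustration, here is a minimal HTTP-triggered function using the functions-framework library, with the payload fields invented for the example; it is not GlobalLogic's code.

```python
# Minimal sketch: an HTTP-triggered Cloud Function using the
# functions-framework library. The payload fields are invented placeholders.
import functions_framework
from flask import jsonify


@functions_framework.http
def score_event(request):
    """Validate a JSON payload and return a trivial 'score' for it."""
    payload = request.get_json(silent=True) or {}

    if "event_id" not in payload:
        return jsonify({"error": "event_id is required"}), 400

    # Placeholder scoring logic; a real function might call Vertex AI or BigQuery.
    score = len(str(payload.get("event_id"))) / 10.0
    return jsonify({"event_id": payload["event_id"], "score": score}), 200
```

Locally, a function like this could be served with the functions-framework CLI (for example, functions-framework --target=score_event) before being deployed with gcloud.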
Posted 1 week ago
12.0 - 18.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
As a Data Engineer Architect with 12-18 years of experience, you will have the opportunity to work remotely and showcase your expertise in various aspects of data architecture. A strong understanding of customer data models, behavioral analytics, segmentation, and machine learning models is expected, and your experience with API integration, real-time event processing, and data pipelines will be instrumental in this role.

Prior experience working in ETL (Extract, Transform, Load) and Data Warehousing (DWH) is essential for this position. Additionally, proficiency in designing and implementing solutions within cloud environments such as GCP (Google Cloud Platform) and cloud data platforms (e.g., Snowflake, BigQuery) is a must-have requirement.

In this role, you will be expected to develop customer-facing user interfaces using BI tools like Google Looker, Power BI, or other open-source tools. Your experience in Agile delivery, coupled with self-motivation, creativity, and strong communication and interpersonal skills, will be a key asset in this position. As a motivated self-starter, you should be able to adapt quickly to changing priorities and think critically to design and deliver effective solutions.

If you have prior experience with Segment CDP platform development, it will be considered a valuable advantage in this role.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Punjab
On-site
As a Digital Marketing Coordinator at TRU, a leading global organization dedicated to leveraging cutting-edge technology to drive business innovation and growth, you will be instrumental in driving revenue growth, customer acquisition, and brand awareness through data-backed digital strategies. You will be responsible for executing digital campaigns, optimizing performance, and reporting across various platforms.

Your key responsibilities will include owning the end-to-end process of digital campaign execution, troubleshooting tracking and reporting issues using tools like Google Tag Manager and Google Analytics, developing visual performance dashboards in Looker Studio, implementing efficient campaign measurement processes, monitoring campaign performance across Google Ads, Meta (Facebook/Instagram), and other platforms, identifying emerging trends and customer insights, analyzing sales funnels and behavior trends, researching and implementing new tools and technologies, and reporting regularly on core marketing KPIs.

To excel in this role, you should have strong hands-on experience with Google Analytics, Google Tag Manager, Looker Studio, BigQuery, and Google Ads. Proficiency in managing and optimizing Facebook Ads and Instagram campaigns, a solid understanding of digital marketing best practices, exceptional analytical and critical thinking skills, the ability to manage multiple campaigns and deadlines with minimal supervision, and strong verbal and written communication skills in English are essential. A digital-first mindset, the ability to interpret data into strategic insights, and a strong portfolio of managing high-performance digital campaigns will be advantageous.

In return, TRU offers a competitive salary and benefits package, opportunities for professional growth and development, and a collaborative and innovative work environment. Join us in transforming businesses through strategic and creative digital solutions, and be a part of our journey towards digital excellence.
Posted 1 week ago
3.0 - 6.0 years
0 - 0 Lacs
Chennai
Work from Office
Who we are looking for:
We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The Data Infrastructure engineering team is responsible for the tools and backend infrastructure that support our data platform, optimized for performance, scalability, and reliability. This role requires strong focus and experience in multi-cloud technologies, message bus systems, automated deployments using containerized applications, design, development, database management and performance, SOX compliance requirements, and implementation of infrastructure through automation with Terraform, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in the development of solutions to ACV's most complex data and software problems. You will be an engineer who can operate in a high-performing team, balance high-quality deliverables with customer focus, communicate excellently, mentor and guide engineers, and deliver results in a fast-paced environment. You will act as a technical liaison, balancing high-quality delivery with customer focus.

What you will be doing:
- Collaborate with cross-functional teams, including Data Scientists, Software Engineers, Data Engineers, and Data Analysts, to understand data requirements and translate them into technical specifications.
- Influence company-wide engineering standards for databases, tooling, languages, and build systems.
- Design, implement, and maintain scalable and high-performance data infrastructure solutions, with a primary focus on data.
- Design, implement, and maintain tools and best practices for (but not limited to) access control, data versioning, database management, and migration strategies.
- Contribute to, influence, and set standards for all technical aspects of a product or service including, but not limited to, coding, testing, debugging, performance, languages, database selection, management, and deployment.
- Identify and troubleshoot database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions.
- Write clean, maintainable, well-commented code and automation to support our data infrastructure layer.
- Perform code reviews, develop high-quality documentation, and build robust test suites for your products.
- Provide technical support for databases, including troubleshooting, performance tuning, and resolving complex issues.
- Collaborate with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance of our products.
- Collaborate with development and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency.
- Participate in SOX audits, including creation of standards and reproducible audit evidence through automation.
- Create and maintain documentation for database and system configurations, procedures, and troubleshooting guides.
- Maintain and extend (as required) existing database operations solutions for backups, index defragmentation, data retention, etc.
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively.
- Be accountable for the overall performance of products and/or services within a defined area of focus.
- Be part of the on-call rotation.
- Handle multiple competing priorities in an agile, fast-paced environment.
- Perform additional duties as assigned.

What you need:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
- Ability to read, write, speak, and understand English
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment
- 1+ years of experience architecting, developing, and delivering software products with emphasis on the data infrastructure layer
- 1+ years of work with continuous integration and build tools
- 1+ years of experience programming in Python
- 1+ years of experience with cloud platforms, preferably GCP/AWS
- Knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools
- Knowledge of version control systems including trunk-based development, multiple release planning, cherry picking, and rebasing
- Hands-on skills and the ability to drill deep into complex system design and implementation
- Experience with: DevOps practices and tools for database automation and infrastructure provisioning; programming in Python and SQL; GitHub, Jenkins; infrastructure-as-code tooling such as Terraform (preferred); big data technologies and distributed databases

Nice to Have Qualifications:
- Experience with NoSQL data stores
- Airflow, Docker, Containers, Kubernetes, DataDog, Fivetran
- Database monitoring and diagnostic tools, preferably Datadog
- Database management/administration with PostgreSQL, MySQL, Dynamo, Mongo
- GCP/BigQuery, Confluent Kafka
- Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, GCP
- Service-Oriented Architecture/Microservices and Event Sourcing in a platform like Kafka (preferred)
- Familiarity with DevOps practices and tools for automation and infrastructure provisioning
- Hands-on experience with SOX compliance requirements
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks
- Knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques

#LI-AM1
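As a hedged illustration of the database automation this team describes, below is a small Python sketch that uses psycopg2 to flag long-running queries in PostgreSQL via pg_stat_activity; the connection string and the five-minute threshold are assumptions made for the example, not this team's tooling.

```python
# Minimal sketch: flag long-running PostgreSQL queries via pg_stat_activity.
# The connection DSN and the duration threshold are placeholder assumptions.
import psycopg2

DSN = "host=localhost dbname=appdb user=monitor password=secret"  # placeholder
THRESHOLD = "5 minutes"

LONG_RUNNING_SQL = """
    SELECT pid, usename, state, now() - query_start AS duration, query
    FROM pg_stat_activity
    WHERE state <> 'idle'
      AND now() - query_start > %s::interval
    ORDER BY duration DESC
"""


def report_long_running_queries() -> None:
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(LONG_RUNNING_SQL, (THRESHOLD,))
            for pid, user, state, duration, query in cur.fetchall():
                print(f"pid={pid} user={user} state={state} duration={duration}")
                print(f"  {query[:120]}")


if __name__ == "__main__":
    report_long_running_queries()
```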
Posted 1 week ago