
1356 BigQuery Jobs - Page 12

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Senior Data Scientist in the Global Data Science & Advanced Analytics team at Colgate-Palmolive, your role will involve leading projects within the Analytics Continuum. You will conceptualize and develop machine learning, predictive modeling, simulation, and optimization solutions that address business questions with clear dollar objectives. Your work will have a significant impact on revenue growth management, price elasticity, promotion analytics, and marketing mix modeling.

Your responsibilities will include:
- Conceptualizing and building predictive modeling solutions to address business use cases
- Applying machine learning and AI algorithms to develop scalable solutions for business deployment
- Developing end-to-end business solutions, from data extraction to statistical modeling
- Conducting model validations and continuously improving algorithms
- Deploying models using Airflow and Docker on Google Cloud Platform
- Leading pricing, promotion, and marketing mix initiatives from scoping to delivery
- Studying large datasets to discover trends and patterns
- Presenting insights in a clear, interpretable manner to business teams
- Developing visualizations using frameworks such as Looker, PyDash, Flask, Plotly, and Streamlit
- Collaborating closely with business partners across different geographies

To qualify for this position, you should have:
- A degree in Computer Science, Information Technology, Business Analytics, Data Science, Economics, or Statistics
- 5+ years of experience in building statistical models and deriving insights
- Proficiency in Python and SQL for coding and statistical modeling
- Hands-on experience with statistical models such as linear regression, random forest, SVM, logistic regression, clustering, and Bayesian regression
- Knowledge of GitHub, Airflow, and visualization frameworks
- Understanding of Google Cloud and related services such as Kubernetes and Cloud Build

Preferred qualifications include experience with revenue growth management, pricing, marketing mix models, and third-party data. Knowledge of machine learning techniques and Google Cloud products will be advantageous for this role.

Colgate-Palmolive is committed to fostering an inclusive environment where diversity is valued and every individual is treated with respect. As an Equal Opportunity Employer, we encourage applications from candidates with diverse backgrounds and perspectives. If you require accommodation during the application process due to a disability, please complete the request form provided. Join us in building a brighter, healthier future for all.
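For flavour, here is a minimal, hedged sketch of the price-elasticity modeling this role describes: a log-log regression whose slope estimates elasticity. The input file and column names (weekly_sales.csv, avg_price, units_sold) are hypothetical, and real RGM models are far richer.

```python
# Illustrative sketch only, not Colgate-Palmolive's method: a log-log OLS fit
# whose slope is a simple price-elasticity estimate. All names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("weekly_sales.csv")           # hypothetical weekly sales extract
X = sm.add_constant(np.log(df["avg_price"]))   # regressor: log(price) plus intercept
y = np.log(df["units_sold"])                   # response: log(demand)

fit = sm.OLS(y, X).fit()
print(f"estimated price elasticity: {fit.params['avg_price']:.2f}")
```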

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeller with 6-9 years of experience, you will be responsible for hands-on data modelling for both OLTP and OLAP systems. The role requires in-depth knowledge of conceptual, logical, and physical data modelling, along with a strong, practical understanding of indexing, partitioning, and data sharding. You should understand the variables that affect database performance, specifically for near-real-time reporting and application interaction. Working experience with at least one data modelling tool is expected, with a preference for DBSchema. Functional knowledge of the mutual fund industry is a plus, and a good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery is necessary.

This position is full-time and based at the client's office in Chennai. Benefits include health insurance and the opportunity to work from home. The schedule is a day shift, and the work location is in person. If you meet the above requirements and are looking to contribute your data modelling expertise in a dynamic environment, we encourage you to apply.
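As a concrete illustration of the partitioning and clustering decisions this listing names, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical.

```python
# Hypothetical example: create a day-partitioned, clustered BigQuery table,
# the kind of physical-design decision this data modelling role involves.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # hypothetical project id
ddl = """
CREATE TABLE IF NOT EXISTS funds.transactions (
  txn_id    STRING,
  fund_code STRING,
  txn_ts    TIMESTAMP,
  amount    NUMERIC
)
PARTITION BY DATE(txn_ts)   -- prune scans to the dates a query touches
CLUSTER BY fund_code        -- co-locate rows that are filtered together
"""
client.query(ddl).result()  # submit the DDL and wait for completion
```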

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

The Staff Engineer - Data in SonyLIV's Digital Business will lead the data engineering strategy, architect scalable data infrastructure, drive innovation in data processing, ensure operational excellence, and build a high-performance team that enables data-driven insights for OTT content and user engagement. The position is based in Mumbai and requires a minimum of 8 years of experience in the field.

Responsibilities include:
- Defining the technical vision for scalable data infrastructure using modern technologies such as Spark, Kafka, Snowflake, and cloud services
- Leading innovation in data processing and architecture through real-time data processing and streaming analytics
- Ensuring operational excellence in data systems by setting and enforcing standards for data reliability and privacy
- Building and mentoring a high-caliber data engineering team
- Collaborating with cross-functional teams
- Driving data quality and business insights through automated quality frameworks and BI dashboards

The successful candidate should have 8+ years of experience in data engineering, business intelligence, and data warehousing, with expertise in high-volume, real-time data environments; a proven track record of building and managing large data engineering teams; experience designing and implementing scalable data architectures; proficiency in SQL; experience with object-oriented programming languages; and knowledge of A/B testing methodologies and statistical analysis. Preferred qualifications include a degree in a related technical field, experience managing the end-to-end data engineering lifecycle, work with large-scale infrastructure, familiarity with automated data lineage and auditing tools, and expertise with BI and visualization tools and advanced processing frameworks.

Joining SonyLIV offers the opportunity to drive the future of data-driven entertainment: collaborating with industry professionals, working with comprehensive data sets, leveraging cutting-edge technology, and making a tangible impact on product delivery and user engagement. The ideal candidate will bring a strong foundation in data infrastructure, experience leading and scaling data teams, and a focus on operational excellence.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Pune

Hybrid

Position: Cloud Data Engineer. Experience Required: 5-8 years. Additional Experience: 8-13 years. Work Location: Wipro, PAN India. Work Arrangement: Hybrid, with 3 days per week in a Wipro office.

Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform.

Job Description:
- Strong expertise in SQL
- Proficient in Python
- Excellent knowledge of any cloud technology (AWS, Azure, GCP, etc.), with GCP preferred
- Familiarity with PySpark is preferred (see the sketch below)
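A minimal PySpark-on-GCP sketch of the kind of task such a role involves; the bucket, the target table, and the spark-bigquery connector being on the classpath are all assumptions.

```python
# Hypothetical sketch: read CSV from GCS, aggregate with Spark SQL, write to
# BigQuery via the spark-bigquery connector (assumed available).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gcs-to-bq").getOrCreate()

orders = spark.read.option("header", True).csv("gs://my-bucket/orders/*.csv")
orders.createOrReplaceTempView("orders")

daily = spark.sql("""
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
""")

(daily.write.format("bigquery")
      .option("table", "analytics.daily_orders")         # hypothetical target
      .option("temporaryGcsBucket", "my-staging-bucket")  # required by the connector
      .mode("overwrite")
      .save())
```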

Posted 2 weeks ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You should be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will configure and optimize the GCP environment, ensure that data pipelines are efficient and accurate, and troubleshoot any issues that arise. You will also work with the security team to ensure that the GCP environment is secure and complies with relevant regulations.

Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Google Cloud Platform (GCP)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues

Skills Requirements:
- In-depth knowledge of GCP services and tools such as Google Cloud Storage, Google BigQuery, and Google Cloud Dataflow
- Experience building scalable and reliable data pipelines using GCP services, Apache Beam, and related big data technologies (see the sketch below)
- Familiarity with cloud-based infrastructure and deployment, specifically on GCP
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
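To make the Dataflow/Apache Beam requirement concrete, here is a minimal Beam pipeline sketch; it runs locally on the DirectRunner, and the GCS paths are hypothetical.

```python
# Minimal Apache Beam example: read text, normalize, filter, write. On GCP the
# same pipeline could be submitted to Cloud Dataflow via a runner option.
import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | "Read"   >> beam.io.ReadFromText("gs://my-bucket/events.jsonl")
     | "Norm"   >> beam.Map(lambda line: line.strip().lower())
     | "Filter" >> beam.Filter(lambda line: "error" not in line)
     | "Write"  >> beam.io.WriteToText("gs://my-bucket/out/cleaned"))
```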

Posted 2 weeks ago

Apply

15.0 - 20.0 years

40 - 50 Lacs

Pune, Bengaluru

Work from Office

Role: Enterprise Architect (Data)
Location: Bangalore

Role overview: We are seeking an experienced Enterprise Data Architect to join our team, focusing primarily on data engineering, data analytics/visualization, and cloud engineering. The ideal candidate will play a crucial role in shaping our technology landscape, participating in project delivery, and contributing to presales activities during lean periods.

Requirements:
- 14 to 20 years of total experience in IT, with a minimum of 3 years in an Enterprise Data Architect capacity
- Strong expertise in data and cloud technologies, with hands-on experience in data architecture, cloud migrations, and modern data platforms
- Knowledge of design patterns and architectural styles
- Experience with data modeling and database design
- Experience with Google Cloud Platform is a MUST, including use of Google's data products; multi-cloud expertise is a plus
- Proven track record of designing and implementing large-scale, complex systems
- Familiarity with modern data tools such as dbt, Snowflake, and Kafka, and proficiency in SQL and Python
- Excellent communication skills, with the ability to convey complex technical concepts to both technical and non-technical audiences
- Strong leadership and mentoring skills

What would you do here:
- Design and oversee enterprise-wide data engineering, data modeling, data analytics, and cloud architecture solutions
- Lead and participate in large-scale projects, integrating solutions across the cloud, data engineering, and analytics practices
- Engage in customer-facing roles, including presales activities and project delivery
- Develop robust data governance frameworks, ensuring compliance with regulations such as GDPR, CCPA, or other industry standards
- Collaborate with cross-functional teams to ensure alignment of technology solutions with business objectives
- Stay current with emerging technologies and industry trends, particularly in cloud computing
- Build reusable assets, frameworks, and accelerators to enhance delivery efficiency
- Participate in, and potentially lead, architectural reviews and governance processes

Posted 2 weeks ago

Apply

6.0 - 9.0 years

12 - 20 Lacs

Bhubaneswar, Hyderabad

Work from Office

Design scalable data systems, develop analytics-ready models, build ETL pipelines, manage SQL/NoSQL DBs, integrate diverse data sources, orchestrate workflows (Airflow/Glue), and collaborate with teams. Skilled in Databricks, BigQuery, SQL, Python.
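A small, hedged sketch of the ETL-with-validation work this listing compresses into one sentence; the paths and column names are hypothetical.

```python
# Hypothetical ETL step with a basic data-quality gate, in the spirit of the
# Databricks/BigQuery pipeline work described above.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("/data/raw/transactions.csv")

clean = (raw.withColumn("txn_date", to_date(col("txn_date")))
            .filter(col("amount").isNotNull()))

# fail fast rather than publish bad data downstream
if clean.filter(col("txn_date").isNull()).count() > 0:
    raise ValueError("unparseable txn_date values found")

clean.write.mode("overwrite").parquet("/data/curated/transactions")
```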

Posted 2 weeks ago

Apply

2.0 - 4.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Zeta Global is looking for an experienced Machine Learning Engineer with industry-proven, hands-on experience delivering machine learning models to production to solve business problems.

To be a good fit for our AI/ML team, you should ideally:
- Be a thought leader who can work with cross-functional partners to foster a data-driven organisation
- Be a strong team player with experience contributing to a large project as part of a collaborative team effort
- Have extensive knowledge of, and expertise in, machine learning engineering best practices and industry standards
- Empower the product and engineering teams to make data-driven decisions

What you need to succeed:
- 2 to 4 years of proven experience as a Machine Learning Engineer in a professional setting
- Proficiency in any programming language (Python preferable)
- Prior experience building and deploying machine learning systems
- Experience with containerization: Docker and Kubernetes
- Experience with AWS cloud services such as EKS, ECS, EMR, and Lambda
- Fluency with workflow management tools such as Airflow or dbt
- Familiarity with distributed batch compute technologies such as Spark
- Experience with modern data warehouses such as Snowflake or BigQuery
- Knowledge of MLflow, Feast, and Terraform is a plus (see the sketch below)
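Since the listing names MLflow as a plus, here is a minimal tracking sketch; the experiment name and the toy model are made up for illustration.

```python
# Hypothetical MLflow example: train a toy model, then log a parameter, a
# metric, and the model artifact so the run is reproducible.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)
mlflow.set_experiment("demo-experiment")          # hypothetical experiment name

with mlflow.start_run():
    clf = RandomForestClassifier(n_estimators=100).fit(X, y)
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("train_accuracy", clf.score(X, y))
    mlflow.sklearn.log_model(clf, "model")        # persist the fitted model
```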

Posted 2 weeks ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Mumbai, Chennai

Hybrid

Required Skills:
- Working knowledge of big data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Ability to work collaboratively with other team members
- Passion for learning and working with versatile technologies

Notice Period: Immediate to 15 days

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Pune

Work from Office

Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications to be deployed at a client site, and to ensure they meet 100% of quality assurance parameters.

Do:

1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation of software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
- Modify software to fix errors, adapt it to new hardware, improve performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to existing ones
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all issues are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to the concerned stakeholders

3. Provide status reporting and maintain customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth, on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document all necessary details and reports formally so the software is properly understood from client proposal to implementation
- Ensure good quality of interaction with the customer with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of internal or external complaints

Mandatory Skills: Google Kubernetes Engine. Experience: 5-8 years.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Chennai

Work from Office

Role: Finance Controller Lead

Do:
- Lead cross-functional global teams in developing finance strategies that support strategic alignment between the company's Business Operations and Corporate departments on company goals and initiatives
- Manage financial goals that result in strong customer satisfaction, align with company strategy, and optimize costs and supplier relations
- Influence senior leaders in setting direction for their functional areas by linking finance and business strategies to optimize business results

Posted 2 weeks ago

Apply

3.0 - 4.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Shift: Night shift (EST)

Description: The Principal Data Analyst will be responsible for analyzing complex datasets, identifying opportunities for process improvements, and implementing automation solutions to streamline workflows. This role requires a deep understanding of data analytics, process automation tools, and excellent problem-solving skills. The ideal candidate will be proactive, detail-oriented, and able to work collaboratively with cross-functional teams to drive data-driven initiatives.

What you'll do:
- Analyze large, complex datasets to identify trends, patterns, and insights that drive business decisions
- Develop, implement, and maintain automated processes to improve data accuracy, efficiency, and reporting capabilities
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions
- Design and build automated dashboards and reports to provide real-time insights to various departments
- Utilize data visualization tools to present findings in a clear and actionable manner
- Continuously monitor and refine automated processes to ensure optimal performance and scalability
- Stay updated with industry trends and best practices in data analytics and process automation
- Mentor junior data analysts on best practices and technical skills

Who you are:
- A great communicator who can convey complex technical features in simple terms
- Able to multitask and prioritize among several high-profile clients
- Highly creative, self-motivated, and driven
- Eager to work in a rapidly changing startup team environment
- An enthusiastic team player with a penchant for collaboration and knowledge sharing
- Willing to do whatever it takes to get the job done
- Nerdy but loveable; data-driven, technical, self-starting, and curious

What you need:
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field
- 3-4 years of experience in data analysis, with a focus on process automation
- A minimum of 2 years of work experience in analytics (minimum of 1 year with a Ph.D.)
- Experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), and/or statistical/mathematical software (e.g., R)
- Experience combining and consolidating disparate datasets in tools such as BigQuery or Databricks
- Proficiency in programming languages such as Python, R, or SQL
- Extensive experience with data visualization tools such as Tableau, Power BI, or similar
- Strong knowledge of process automation tools and platforms (e.g., Alteryx, UiPath, Microsoft Power Automate)
- Experience with database management systems (e.g., SQL Server, MySQL, PostgreSQL)
- Excellent analytical and problem-solving skills
- Ability to work effectively in a fast-paced, collaborative environment
- Strong communication skills, with the ability to convey complex data insights to non-technical stakeholders
- Experience with machine learning and predictive analytics is a plus

Posted 2 weeks ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Chennai

Work from Office

We are seeking a mid-level GCP data engineer with 4+ years of experience in ETL, data warehousing, and data engineering. The ideal candidate will have hands-on experience with GCP tools and solid data analysis skills.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Kolkata, New Delhi, Bengaluru

Work from Office

Good hands-on experience working as a GCP Data Engineer, with very strong experience in SQL and PySpark, as well as BigQuery, Dataform, Dataplex, etc. Looking for immediate joiners or candidates currently serving notice only.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

8 - 15 Lacs

Pune

Hybrid

Job Description: We are hiring an ETL Engineer with GCP experience.
Location: India (Pune)
Experience: 3-7 years

Required Skills and Qualifications:
- 3+ years of experience in data engineering roles
- Strong hands-on experience with Google Cloud Platform (GCP) data services, specifically BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage
- Mandatory expertise in Apache Airflow, including designing, developing, and deploying complex DAGs
- Mandatory strong proficiency in SQL and PL/SQL for data manipulation, stored procedures, functions, and complex query writing
- Experience with Informatica
- Ability to optimize BigQuery queries for performance and cost
- Familiarity with version control systems (e.g., Git)
- Excellent problem-solving, analytical, and communication skills
- Ability to work independently and collaboratively in an agile environment
- Bachelor's degree in Computer Science, Engineering, or a related field

Posted 2 weeks ago

Apply

4.0 - 6.0 years

8 - 10 Lacs

Gurugram

Work from Office

Job Location: Gurugram. Night shift (11 pm onwards).

Job Description: We are looking for a Data Analyst with expertise in Data Studio, SFDC (Salesforce), Google Sheets, BigQuery, and Google Cloud Platform. The candidate should have strong BigQuery SQL and advanced Excel skills, plus expertise in Looker Studio, to drive data-driven insights and reporting. The ideal candidate will work closely with stakeholders to analyze data, create dashboards, and optimize reporting processes.

Responsibilities:
- Write and optimize complex SQL queries for data analysis
- Develop interactive dashboards and reports in Looker Studio
- Extract, transform, and analyze data to support business decisions
- Work with cross-functional teams to improve data visualization and reporting efficiency
- Ensure data accuracy and integrity in all reports

Requirements:
- 4+ years of experience in data analysis
- Strong proficiency in BigQuery SQL (joins, CTEs, window functions, etc.) and advanced Excel (see the sketch below)
- Hands-on experience with Data Studio, SFDC (Salesforce), Google Sheets, and Google Cloud Platform
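To illustrate the windowed BigQuery SQL this posting asks for, a hedged sketch submitted through the Python client; the project, table, and column names are hypothetical.

```python
# Hypothetical example of a BigQuery window function (running total per rep),
# run via google-cloud-bigquery.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT
  rep_id,
  close_date,
  amount,
  SUM(amount) OVER (PARTITION BY rep_id ORDER BY close_date) AS running_total
FROM `my-project.sales.opportunities`
WHERE close_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
"""
for row in client.query(sql):          # iterating the job yields result rows
    print(row.rep_id, row.close_date, row.running_total)
```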

Posted 2 weeks ago

Apply

6.0 - 8.0 years

18 - 30 Lacs

Hyderabad

Hybrid

Key Skills: Data engineering, Apache Airflow, GCP, BigQuery, GCS, SQL, ETL/ELT, Docker, Kubernetes, data governance, Agile, CI/CD, DevOps, pipeline orchestration, technical leadership.

Roles & Responsibilities:
- Evaluate and provide scalable technical solutions to address complex, interdependent data processes
- Ensure data quality and accuracy by implementing data quality checks, data contracts, and governance processes
- Collaborate with software development teams and business analysts to understand data requirements and deliver fit-for-purpose data solutions
- Lead the team in delivering end-to-end data engineering solutions
- Design, develop, and maintain complex applications to support data processing workflows
- Develop and manage data pipelines and workflows using Apache Airflow on GCP (see the DAG sketch below)
- Integrate data from various sources into Google BigQuery and Google Cloud Storage (GCS)
- Write and optimize advanced SQL queries for ETL/ELT processes
- Maintain data consistency and troubleshoot issues in data workflows
- Create and maintain detailed technical documentation for pipelines and workflows
- Mentor junior data engineers and provide technical leadership and support
- Lead project planning, execution, and successful delivery of data engineering initiatives
- Stay updated with emerging trends and technologies in data engineering and cloud computing

Experience Requirements:
- 6-8 years of experience leading the design, development, and deployment of complex data pipelines
- Strong working knowledge of Apache Airflow on GCP for orchestration
- Hands-on experience integrating data from various sources into Google BigQuery and GCS
- Proficiency in writing and optimizing complex SQL queries for large-scale data processing
- Practical knowledge of containerization technologies such as Docker and Kubernetes
- Experience implementing data governance and adhering to data security best practices
- Familiarity with Agile methodology and working in cross-functional teams
- Experience with CI/CD pipelines and DevOps practices for data engineering workflows

Education: B.Tech + M.Tech (dual degree), B.Tech, or M.Tech.
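A minimal sketch of an Airflow-on-GCP DAG like those this role owns, assuming Airflow 2.x with the apache-airflow-providers-google package installed; the DAG id, SQL, and table names are hypothetical.

```python
# Hypothetical daily DAG with one BigQuery task; real pipelines would chain
# ingestion, transformation, and quality-check tasks.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_event_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": (
                    "SELECT event_date, COUNT(*) AS n "
                    "FROM `raw.events` GROUP BY event_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```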

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Mumbai

Work from Office

We have a suitable opportunity in the Beauty Tech division for the position of Data Product Manager - Data Engineering & Data Science, joining our expanding Data Analytics Domain team. This critical role will shape the future of data-driven decisions at L'Oréal, leveraging data engineering, machine learning, and advanced analytics to deliver innovative customer experiences. The job is located at the Mumbai head office.

Key Responsibilities: As Data Product Manager: Data Engineering & Data Science, you will own the product vision, strategy, and roadmap for our internal data & analytics tools and machine learning product capabilities. You will act as the tech partner between business stakeholders and the data engineering, data science, and design teams, bringing cutting-edge data products to life. This requires a deep understanding of L'Oréal's business objectives and the ability to translate them into functional, user-friendly solutions.

- Strategy & Roadmap: Define and champion the technology vision, strategy, and roadmap for data analytics products and use cases of diverse complexity, from dashboards and reports to complex data products incorporating machine learning and generative AI. This includes identifying market opportunities, defining target audiences, and prioritizing product features.
- Requirements & Analysis: Collaborate with stakeholders across business units (Commerce, Transformation, Marketing, Supply Chain, etc.) to gather and analyze requirements, translating user needs into detailed product specifications.
- Development & Execution: Oversee the entire product lifecycle from ideation to launch, ensuring timely and efficient delivery of high-quality products by working closely with data engineering, design, and QA teams.
- Data Platform Run & Change Management: Manage the current data platform for consistent performance in run mode, making the necessary day-to-day updates and ensuring uptime compliance.
- Stakeholder Management: Ensure alignment by communicating product updates, roadmap changes, and strategic decisions to key stakeholders. Cultivate strong cross-functional relationships to foster collaboration and drive product success.
- Data Analysis & Insights: Leverage data analysis to track product performance, measure key metrics, and identify areas for continuous improvement.
- Machine Learning Expertise: Possess a solid understanding of machine learning concepts, algorithms, and applications to effectively guide the development of ML-powered products.

Key Competencies:
- Highly driven, technically proficient program/delivery manager with proven experience
- Excellent stakeholder management skills, with an ability to work in a cross-functional environment
- Excellent planning, organizational, and time management skills
- Strong communication and collaboration skills for managing key stakeholders, vendors, and partners in business and IT (Data)
- A proactive, make-it-happen temperament, with operational excellence and agility

Physical Demands (e.g., % travel): Travel may be need-based.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Business, Statistics, or a related field; Master's degree preferred
- 8+ years of product management experience focused on data analytics, business intelligence, or machine learning products
- Proven track record of launching and managing data-driven products within cloud environments, specifically GCP (BigQuery, Data Studio, BQML, SQL, Power BI)
- Exceptional analytical, problem-solving, communication, presentation, and interpersonal skills
- Strong understanding of machine learning concepts and algorithms and their application within the GCP AI/ML ecosystem
- Experience with Agile development methodologies
- A passion for the beauty and cosmetics industry

Posted 2 weeks ago

Apply

7.0 - 12.0 years

0 - 0 Lacs

Hyderabad, Bangalore, Mohali

Remote

We are seeking a highly skilled and experienced GCP Data Engineer with 7+ years of experience in data engineering and cloud technologies. The ideal candidate will have deep expertise in Google Cloud Platform (GCP), BigQuery, Python, and SQL. You will play a key role in designing and building scalable data pipelines and data solutions that power analytics and business intelligence.

Key Responsibilities:
- Design, develop, and optimize data pipelines using GCP services, especially BigQuery, Cloud Storage, and Dataflow (see the load sketch below)
- Develop and maintain ETL/ELT processes using Python and SQL
- Implement data models and schemas in BigQuery for efficient querying and storage
- Collaborate with data scientists, analysts, and stakeholders to define data requirements and deliver robust data solutions
- Monitor and troubleshoot data pipeline performance, quality, and reliability issues
- Ensure best practices in data security, governance, and compliance on GCP
- Automate data workflows and contribute to CI/CD pipeline integration for data solutions
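One common ingestion step behind postings like this, sketched with the google-cloud-bigquery client; the bucket, dataset, and reliance on schema autodetection are assumptions.

```python
# Hypothetical ELT load: stage a CSV in GCS, then load it into BigQuery.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,                  # skip the header row
    autodetect=True,                      # infer the schema from the file
    write_disposition="WRITE_TRUNCATE",   # replace the staging table each run
)
job = client.load_table_from_uri(
    "gs://my-bucket/exports/customers.csv",   # hypothetical source
    "my-project.staging.customers",           # hypothetical destination
    job_config=job_config,
)
job.result()   # block until the load job finishes
```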

Posted 2 weeks ago

Apply

8.0 - 12.0 years

15 - 30 Lacs

Gurugram

Work from Office

Role description:
- Lead and mentor a team of data engineers to design, develop, and maintain high-performance data pipelines and platforms
- Architect scalable ETL/ELT processes, streaming pipelines, and data lake/warehouse solutions (e.g., Redshift, Snowflake, BigQuery); see the streaming sketch below
- Own the roadmap and technical vision for the data engineering function, ensuring best practices in data modeling, governance, quality, and security
- Drive adoption of modern data stack tools (e.g., Airflow, Kafka, Spark) and foster a culture of continuous improvement
- Ensure the platform is reliable, scalable, and cost-effective across batch and real-time use cases
- Champion data observability, lineage, and privacy initiatives to ensure trust in data across the organization

Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field
- 8+ years of hands-on experience in data engineering, with at least 2 years in a leadership or managerial role
- Proven experience with distributed data processing frameworks such as Apache Spark, Flink, or Kafka
- Strong SQL skills and experience in data modeling, data warehousing, and schema design
- Proficiency with cloud platforms (AWS/GCP/Azure) and their native data services (e.g., AWS Glue, Redshift, EMR, BigQuery)
- Solid grasp of data architecture, system design, and performance optimization at scale
- Experience working in an agile development environment and managing sprint-based delivery cycles
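To ground the streaming-pipeline bullet above, a minimal Spark Structured Streaming sketch; the broker, topic, and the spark-sql-kafka package being on the classpath are assumptions.

```python
# Hypothetical streaming read from Kafka; a real pipeline would parse, enrich,
# and sink to a warehouse instead of the console.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
          .option("subscribe", "events")                     # hypothetical topic
          .load())

query = (events.select(col("value").cast("string").alias("payload"))
         .writeStream.format("console")
         .option("checkpointLocation", "/tmp/events-chk")
         .start())
query.awaitTermination()
```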

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

Job Summary: Synechron is seeking an experienced Data Processing Engineer to lead the development of large-scale data processing solutions using Java, Apache Flink/Storm/Beam, and Google Cloud Platform (GCP). In this role, you will collaborate across teams to design, develop, and optimize data-intensive applications that support strategic business objectives. Your expertise will help evolve our data architecture, improve processing efficiency, and ensure the delivery of reliable, scalable solutions in an Agile environment.

Software Requirements:
Required:
- Java (version 8 or higher)
- Apache Flink, Storm, or Beam for streaming data processing
- Google Cloud Platform (GCP) services, especially BigQuery and related data tools
- Experience with databases such as BigQuery, Oracle, or equivalent
- Familiarity with version control tools such as Git
Preferred:
- Cloud deployment experience, with GCP in particular
- Familiarity with containerization (Docker/Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices

Overall Responsibilities:
- Collaborate closely with cross-functional teams to understand data and system requirements, then design scalable solutions aligned with business needs
- Develop detailed technical specifications, implementation plans, and documentation for new features and enhancements
- Implement, test, and deploy data processing applications using Java and Apache Flink/Storm/Beam within GCP environments
- Conduct code reviews to ensure quality, security, and maintainability, supporting team members' growth and best practices
- Troubleshoot technical issues, resolve bottlenecks, and optimize application performance and resource utilization
- Stay current with advancements in data processing, cloud technology, and Java development to continuously improve solutions
- Support testing teams to verify data workflows and validation processes, ensuring reliability and accuracy
- Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure continuous delivery and process improvement

Technical Skills (by category):
- Programming languages: Java (8+) required; Python, Scala, or Node.js preferred for scripting or auxiliary processing
- Databases/data management: experience with BigQuery, Oracle, or similar relational data stores
- Cloud technologies: GCP (BigQuery, Cloud Storage, Dataflow, etc.), with hands-on experience in cloud data solutions
- Frameworks and libraries: Apache Flink, Storm, or Beam for stream processing; Java SDKs, APIs, and data integration libraries
- Development tools and methodologies: Git, Jenkins, JIRA, and Agile/Scrum practices; familiarity with containerization (Docker, Kubernetes) is a plus
- Security and compliance: understanding of data security principles in cloud environments

Experience Requirements:
- 4+ years of experience in software development, with a focus on data processing and Java-based backend development
- Proven experience working with Apache Flink, Storm, or Beam in production environments
- Strong background in managing large data workflows and pipeline optimization
- Experience with GCP data services and cloud-native development
- Demonstrated success in Agile projects, including collaboration with cross-functional teams
- Previous leadership or mentorship experience is a plus

Day-to-Day Activities:
- Design, develop, and deploy scalable data processing applications in Java using Flink/Storm/Beam on GCP
- Collaborate with data engineers, analysts, and architects to translate business needs into technical solutions
- Conduct code reviews, optimize data pipelines, and troubleshoot system issues swiftly
- Document technical specifications, data schemas, and process workflows
- Participate actively in Agile ceremonies, provide updates on task progress, and suggest process improvements
- Support continuous integration and deployment of data applications
- Mentor junior team members, sharing best practices and technical insights

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent
- Relevant certifications in cloud technologies or data processing (preferred)
- Evidence of continuous professional development and staying current with industry trends

Professional Competencies:
- Strong analytical and problem-solving skills focused on data processing challenges
- Leadership abilities to guide, mentor, and develop team members
- Excellent communication skills for technical documentation and stakeholder engagement
- Adaptability to rapidly changing technologies and project priorities
- Capacity to prioritize tasks and manage time efficiently under tight deadlines
- An innovative mindset to leverage new tools and techniques for performance improvements

Posted 2 weeks ago

Apply

1.0 - 3.0 years

9 - 13 Lacs

Pune

Work from Office

Overview: We are hiring an Associate Data Engineer to support our core data pipeline development efforts and gain hands-on experience with industry-grade tools like PySpark, Databricks, and cloud-based data warehouses. The ideal candidate is curious, detail-oriented, and eager to learn from senior engineers while contributing to the development and operationalization of critical data workflows.

Responsibilities:
- Assist in the development and maintenance of ETL/ELT pipelines using PySpark and Databricks under senior guidance
- Support data ingestion, validation, and transformation tasks across Rating Modernization and Regulatory programs
- Collaborate with team members to gather requirements and document technical solutions
- Perform unit testing, data quality checks, and process monitoring activities
- Contribute to the creation of stored procedures, functions, and views
- Support troubleshooting of pipeline errors and validation issues

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related discipline
- 3+ years of experience in data engineering, or internships in data/analytics teams
- Working knowledge of Python and SQL, and ideally PySpark
- Understanding of cloud data platforms (Databricks, BigQuery, Azure/GCP)
- Strong problem-solving skills and eagerness to learn distributed data processing
- Good verbal and written communication skills

What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing
- Flexible working arrangements, advanced technology, and collaborative workspaces
- A culture of high performance and innovation, where we experiment with new ideas and take responsibility for achieving results
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into, and improve transparency across, the investment process.

MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 13 Lacs

Mohali, Chandigarh, Panchkula

Work from Office

Job Title: Digital Marketing Coordinator

Must have good communication skills. Please note: the candidate should be flexible on shift timings. Initially it will be a general shift (9:30 AM to 6:30 PM IST), but per requirements it can change to 2:30 PM to 11:30 PM IST.

We are seeking a self-driven, ambitious, and tech-savvy Digital Marketing Coordinator to join our growing team. In this role, you will play a pivotal part in driving revenue growth, customer acquisition, and brand awareness through data-backed digital strategies. You will be responsible for campaign execution, performance optimization, and reporting across multiple digital platforms.

Key Responsibilities:
- Own the end-to-end process of digital campaign execution, from planning and setup to performance tracking and reporting
- Proactively troubleshoot tracking and reporting issues using tools like Google Tag Manager and Google Analytics
- Develop, refine, and maintain visual performance dashboards in Looker Studio for internal stakeholders
- Implement streamlined processes to ensure efficient, accurate, and scalable campaign measurement
- Monitor campaign performance across Google Ads, Meta (Facebook/Instagram), and other digital platforms to optimize ROI and achieve KPIs
- Identify emerging trends and customer insights and translate them into actionable recommendations
- Analyze sales funnels, web analytics, and behaviour trends to inform growth strategies
- Research and implement new tools and technologies to support digital campaign execution and reporting
- Report regularly on core marketing KPIs, including ROAS, CPL, conversion rates, website traffic, and engagement metrics

Requirements:
- Strong hands-on experience with Google Analytics, Google Tag Manager, Looker Studio, BigQuery, and Google Ads
- Proficiency in managing and optimizing Facebook Ads and Instagram campaigns
- Solid understanding of digital marketing best practices, including SEO, SEM, social media marketing, email marketing, and social media optimization
- A digital-first mindset with an ability to turn data into strategic insights
- Exceptional analytical and critical thinking skills with a results-driven attitude
- Ability to manage multiple campaigns and deadlines with minimal supervision
- Strong verbal and written communication skills in English
- A strong portfolio and/or track record of managing high-performance digital campaigns

Posted 2 weeks ago

Apply

3.0 - 8.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider with operations in India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Senior
Location: Chennai
Work Type: Hybrid

Position Description: We are a global IT product team that implements and maintains globally used services for tax decisioning and reporting. Our products are based on SaaS cloud solutions from specialized vendors like Vertex and Edicom. Our team manages configuration, integration, and implementation of those solutions in the client environment using Informatica Cloud middleware (IICS) and Python and Bash scripts running on Linux servers and in Google Cloud Platform. We need an experienced senior developer who independently develops software (Informatica Cloud middleware) and Python/Bash scripts to deliver user stories that contribute to a valuable software product. He/she needs to perform, control, and innovatively improve application development, deployment, and testing standards for Informatica Cloud middleware, and plan, design, and realize Data Integration, Application Integration, and BPEL Service concepts in IICS and the Informatica Process Developer tool. We expect experience in software development and in executing and evaluating tests to confirm correct application functionality and to identify and fix software deficiencies.

Skills Required: ETL/Informatica, SOAP, Extensible Markup Language (XML), Linux, Python, SQL, communication skills
Skills Preferred: BigQuery, Agile software development, GitHub, Tekton, GCP Cloud Run

Experience Required:
- At least 3 years of experience in software development and maintenance in Informatica Cloud Middleware (IICS)
- Multi-year experience in other relevant technologies: Python, Linux, Google Cloud Platform (BigQuery, Cloud Run)
- Efficiently managing many items in parallel
- Advanced communication skills in English (written and oral)
- Experience working in a global team

Experience Preferred: Experience with other tools relevant to software development and deployment, e.g., GitHub, Tekton

Education Required: Bachelor's degree

TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

22 - 27 Lacs

Noida, New Delhi, Gurugram

Hybrid

Ab Initio ETL Application Developer

We are seeking an ETL developer with Ab Initio experience for Data Warehouse and Data Mart applications within the healthcare insurance business. The position requires strong proficiency in data integration (ETL) and involvement in all phases of application development. When you join the team, you are joining a team of elite, passionate software professionals who take pride in engineering excellence and creative solutions that add value to our stakeholders. You will have plenty of opportunities to showcase your talent, learn new technologies, and have a rewarding and fulfilling career.

Responsibilities:
- Develop complex programs from detailed technical specifications
- Design, code, test, debug, and document those programs; work at the highest technical level across all phases of applications systems analysis and programming activities
- Independently design and/or code the development of cost-effective application and program solutions
- Independently perform ongoing system maintenance, research, problem resolution, and on-call support tasks for existing systems
- Be fully familiar and compliant with the prescribed methodologies, and ensure compliance in all work performed
- Perform unit testing; may perform or assist with integration and system testing according to detailed test plans to ensure high-quality systems, and may assist business partners with User Acceptance Testing
- Follow all procedures and directions to ensure code asset management for an application or set of applications
- Support and promote the reuse of assets across the organization

Required Qualifications:
- Expertise in one or more programming languages, development tools, and/or databases, and in the systems development life cycle applicable to the development organization
- 7+ years of Ab Initio and data warehousing experience in a parallel processing environment
- 5+ years of SQL experience, including advanced SQL coding in an enterprise setting
- Hadoop, Pig, Hive, Scope, HQL experience
- Strong Unix KShell scripting desired
- Ab Initio and data warehousing coding, testing, and debugging experience
- Solid analytical and software development skills
- Ability to optimize SQL code for efficiency
- GCP cloud technologies, including BigQuery

Preferred Qualifications:
- Healthcare domain experience
- ZEKE knowledge a plus
- Working knowledge of mainframe and midrange environments
- Experience with application development support software packages
- Experience working in an Agile framework such as SAFe
- Affiliations with a technical or professional organization or user group

Posted 2 weeks ago

Apply