3 - 5 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role
Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications to be deployed at a client end, and to ensure they meet 100% of quality assurance parameters.

Do:
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modification of existing ones
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Ensure all defects are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks and report them to the concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Formally document the details and reports necessary for a proper understanding of the software, from client proposal to implementation
- Ensure good-quality interaction with the customer with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver (performance parameter: measure):
1. Continuous integration, deployment & monitoring of software: 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT: on-time delivery, software management, troubleshooting of queries, customer experience, completion of assigned certifications for skill upgradation
3. MIS & reporting: 100% on-time MIS & report generation

Mandatory Skills: Google BigQuery. Experience: 3-5 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 2 months ago
3 - 6 years
4 - 8 Lacs
Bengaluru
Work from Office
About The Role: Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

About The Role - Grade Specific: The primary focus is to help organizations design, develop, and optimize their data infrastructure and systems. They help organizations enhance data processes and leverage data effectively to drive business outcomes.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, CentOS, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Google Big Table, Google Data Proc, Greenplum, HQL, IBM Data Stage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - RedHat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, SAS, Scala Spark, Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Posted 2 months ago
- 2 years
2 - 4 Lacs
Gurugram
Remote
What does the team do? The Ad Operations team is responsible for setting up, managing, analysing and optimising digital advertising campaigns. They ensure ads are delivered correctly, track performance, and troubleshoot issues to maximise campaign effectiveness and revenue.

What You'll Do:
Data Management: Gather, organize, and maintain data related to advertising campaigns and their revenue, ensuring accuracy and consistency.
Querying the Database: Use SQL/BigQuery to run queries on ShareChat's analytical engine (see the example sketched below).
Scripting: Write scalable scripts to fetch or modify data from API endpoints. Collaborate with data teams to ensure proper integration and flow of data between different systems and platforms.
Reporting and Insights: Create reports and dashboards to visualize key performance metrics. Generate regular and ad-hoc reports that provide insights into monthly/quarterly/annual revenue, campaign performance and key metrics. Communicate findings and insights to cross-functional teams, including AdOps, sales and management, to drive data-informed decision-making.
Ad Strategy: Work with the Strategy team, providing data insights to develop go-to-market strategies for key clients. Monitor ad inventory levels and work with Strategy teams to ensure ad space is efficiently utilized. Assist in forecasting future ad inventory needs based on historical data. Identify opportunities for process improvements and automation in ad operations workflows. Contribute to the development and implementation of best practices and standard operating procedures for ad operations.
Salesforce Administration, Integration and Automation: Configure, customize and maintain the Salesforce CRM system to meet the specific needs of the advertising team. Create and manage custom objects, fields, and workflows to support advertising operations. Integrate Salesforce with other advertising and marketing tools and platforms for seamless data flow. Automate routine tasks and processes to improve efficiency and reduce manual work.
Who are you? A BS in Mathematics, Economics, Computer Science, Information Management or Statistics is preferred. Proven working experience as a data analyst or business data analyst. Strong knowledge of and experience with SQL/BigQuery and Excel. Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. Experience with Salesforce would be an advantage.
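For illustration, a minimal sketch of the kind of BigQuery reporting query this role describes, using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical, as is the revenue schema:

```python
# Minimal sketch: monthly revenue by campaign from a hypothetical ads table.
# Requires `pip install google-cloud-bigquery` and application-default
# credentials; all names below are illustrative.
from google.cloud import bigquery

client = bigquery.Client()

QUERY = """
    SELECT
        campaign_id,
        DATE_TRUNC(event_date, MONTH) AS month,
        SUM(revenue) AS total_revenue
    FROM `my_project.adops.campaign_events`   -- hypothetical table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
    GROUP BY campaign_id, month
    ORDER BY month, total_revenue DESC
"""

for row in client.query(QUERY).result():
    print(row.campaign_id, row.month, row.total_revenue)
```

A query like this would typically feed the monthly and quarterly revenue reports mentioned above, with results exported to a dashboard or spreadsheet.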
Posted 2 months ago
3 - 6 years
10 - 16 Lacs
Hyderabad
Hybrid
Requirements: Must have worked in a QA role for ETL/data transformation. Minimum 3+ years of QA experience. Strong SQL skills. Proficiency in using BigQuery functions and operators for data comparison, aggregation, and validation across the different stages of the transformation (a minimal example follows below). Should have the analytical skills to understand complex requirements and perform in-depth data validation. Should be able to work independently to create artifacts such as a test strategy, test plan, etc. Good understanding of data mapping and data requirements. Inclined to do rigorous and repeatable testing, with enthusiasm for finding bugs. Willing to do manual SQL QA work with complex queries. Intuitive and able to work independently. Strong communication skills are a must. Experience in, and desire to work in, a global delivery environment. Location: Hyderabad. Shift: 1.00 PM to 10.00 PM. Notice Period: Immediate to 15 days.
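For illustration, a minimal sketch (with hypothetical project, dataset, and table names) of the kind of BigQuery comparison-and-validation work this role calls for: checking row counts and key coverage between a staging table and its transformed target.

```python
# Minimal sketch of a source-vs-target check after a transformation step.
# All dataset/table/column names are illustrative; assumes the
# google-cloud-bigquery package and valid credentials.
from google.cloud import bigquery

client = bigquery.Client()

checks = {
    # Row counts should match between staging and the transformed table.
    "row_count_delta": """
        SELECT
          (SELECT COUNT(*) FROM `proj.staging.orders`) -
          (SELECT COUNT(*) FROM `proj.mart.orders`) AS delta
    """,
    # Keys present in staging but missing from the target (key-level diff).
    "missing_rows": """
        SELECT COUNT(*) AS delta FROM (
          SELECT order_id FROM `proj.staging.orders`
          EXCEPT DISTINCT
          SELECT order_id FROM `proj.mart.orders`
        )
    """,
}

for name, sql in checks.items():
    delta = list(client.query(sql).result())[0].delta
    status = "PASS" if delta == 0 else f"FAIL (delta={delta})"
    print(f"{name}: {status}")
```

In practice, checks like these would be parameterized per table and folded into the test strategy and test plan artifacts the role mentions.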
Posted 2 months ago
5 - 10 years
30 - 45 Lacs
Bengaluru
Work from Office
Company Overview: Lifesight is a fast-growing SaaS company focused on helping businesses leverage data & AI to improve customer acquisition and retention. We have a team of 130 serving 300+ customers across 5 offices in the US, Singapore, India, Australia and the UK. Our mission is to make it easy for non-technical marketers to leverage advanced data activation and marketing measurement tools that are powered by AI, to improve their performance and achieve their KPIs. Our product is being adopted rapidly globally and we need the best people onboard the team to accelerate our growth.

Position Overview: The ideal candidate is self-motivated, self-managed, a multi-tasker, and a demonstrated team player. You will be a lead developer responsible for the development of new software products, as well as for improving numerous non-functional aspects of the products. If you're looking to be part of a dynamic, highly analytical team and an opportunity to hone your Java, cloud engineering and distributed-systems skills, look no further. As our Senior Software Engineer for the platform team, you will be handed the reins to build the core microservices. Along with building services to query billions of rows in Google's BigQuery, you will be in charge of building scalable APIs to build user segments, and of evolving the architecture to send millions of notifications per day through varied streams like email, SMS and in-app notifications.

What you'll do:
- Be responsible for the overall development of the modules/services you work on
- Code, design, prototype, perform reviews and consult in the process of building highly scalable, reliable, and fault-tolerant systems
- Continuously refactor applications and architectures to maintain high quality levels
- Stay abreast of the latest technologies in distributed systems and caching, and research new technologies and tools that enable building the next generation of systems
- Write readable, concise, reusable, extensible code every day
- Discuss and articulate requirements with product management, and scope and execute the feature roadmap
- Participate in the team's hiring process as an interview panelist

Requirements - What you'll need:
- Ideally 7+ years of hands-on experience in designing, developing, testing, and deploying large-scale applications and microservices in any language or stack (preferably Java/Spring Boot)
- Good knowledge in one or more of these areas: cloud, NoSQL stores; we use Google Cloud, Kubernetes, BigQuery and messaging systems
- Excellent attitude and passion for working in a team, with a willingness to learn
- Experience in building low-latency, high-volume REST API request handling
- Experience working with distributed caches like Redis
- Ability to get stuff done!

Bonus points if you have:
- Experience with containerization technologies like Docker and Kubernetes
- Experience working on any cloud platform (preferably GCP)
- Experience with NoSQL stores (like Cassandra, ClickHouse, BigQuery)

Benefits - What is in it for the candidate: As a team, we are concerned not only with the growth of the company, but with each other's personal growth and well-being too. Along with our desire to utilize smart technology and innovative engineering strategies to make people's lives easier, our team also bonds over a shared love for all kinds of tea, movies and fun-filled Friday events, while prioritizing a healthy work-life balance.
1. Work for one of the fastest-growing and most successful MarTech companies of our time
2. Be an early member of the core team, building a product from scratch: from making tech-stack choices to driving and influencing ways to simplify building complex products
3. Enjoy working in small teams and a non-bureaucratic environment
4. Enjoy an environment that provides high levels of empowerment and space to achieve your objectives and grow with the organization
5. Work in a highly profitable and growing organization, with opportunities to accelerate and shape your career
6. Great benefits, apart from competitive compensation
7. Above all, a "fun" working environment
Posted 2 months ago
4 - 9 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
5+ years of experience in software development using C#, MSSQL and GCP/BigQuery; Python experience is good to have. Contribute to the design and development of innovative software solutions that meet business requirements. Develop and maintain applications using the specified technologies. Participate in code reviews to ensure high-quality code and adherence to best practices. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Experience in code reviews and maintaining code quality. Ability to mentor and guide junior developers. Bachelor's degree in Computer Science, Engineering, or a related field.
Posted 2 months ago
3 - 8 years
12 - 15 Lacs
Mumbai
Work from Office
Responsibilities: Develop and maintain data pipelines using GCP. Write and optimize queries in BigQuery. Utilize Python for data processing tasks. Manage and maintain SQL Server databases. Must-Have Skills: Experience with Google Cloud Platform (GCP). Proficiency in BigQuery query writing. Strong Python programming skills. Expertise in SQL Server. Good to Have: Knowledge of MLOps practices. Experience with Vertex AI. Background in data science. Familiarity with any data visualization tool.
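As a small illustration of the "write and optimize queries in BigQuery" duty above, the sketch below uses a dry run to compare the bytes a query would scan with and without a partition filter. The table name is hypothetical and assumed to be partitioned on event_date:

```python
# Minimal sketch: dry-run cost comparison for partition pruning.
# Assumes google-cloud-bigquery is installed; `proj.analytics.events`
# is an illustrative table partitioned on the `event_date` column.
from google.cloud import bigquery

client = bigquery.Client()
cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

unfiltered = "SELECT user_id FROM `proj.analytics.events`"
filtered = (
    "SELECT user_id FROM `proj.analytics.events` "
    "WHERE event_date = '2024-01-01'"  # prunes the scan to one partition
)

for label, sql in [("full scan", unfiltered), ("partition filter", filtered)]:
    job = client.query(sql, job_config=cfg)  # dry run: nothing is executed
    print(f"{label}: {job.total_bytes_processed / 1e9:.2f} GB would be scanned")
```

Since BigQuery bills by bytes scanned, this kind of before/after dry run is a cheap way to verify that a rewrite actually reduces cost.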
Posted 2 months ago
2 - 4 years
3 - 8 Lacs
Kolkata
Remote
Data Quality Analyst. Experience: 2-4 years. Salary: Competitive. Preferred Notice Period: Within 30 days. Shift: 10:00 AM to 7:00 PM IST. Opportunity Type: Remote. Placement Type: Permanent. (Note: This is a requirement for one of Uplers' clients.) Must-have skills: Data Validation, BigQuery, SQL, Communication Skills. Good-to-have skills: Data Visualisation, PowerBI, Tableau.

Forbes Advisor (one of Uplers' clients) is looking for a Data Quality Analyst who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview
Short-term objectives: We know the importance data validation can play in creating better reporting for our business, and we have identified areas where we want you to make an impact within the first 3 months: push 40% of partners through the ingestion validation process, and push 40% of partners through the mapping validation process.

Data Team Culture: Our team requires four areas of focus from every team member (see below). We use these focus areas to guide our decision making and career growth. To give you an idea of these requirements, the top three from each area are:
Mastery:
• Demonstrate skills expertise in a relevant tool (e.g., GA, Tableau) or code language (e.g., SQL)
• Think about the wider impact & value of decisions
• Understand and anticipate the need for scalability, stability, and security
Communication:
• Provide clear, actionable feedback from peer reviews
• Communicate effectively to wider teams and stakeholders
• Proactively share knowledge every day
Ownership:
• Lead complex initiatives that drive challenging goals
• Create and push forward cross-cutting concerns between teams
• Demonstrate consistently sound judgement
Behaviours:
• Challenge yourself and others through questioning, assessing business benefits, and understanding the cost of delay
• Own your workload and decisions; show leadership to others
• Innovate to find new solutions or improve existing ways of working; push yourself to learn every day

Responsibilities: Reports directly to the Senior Business Analyst and works closely with the Data & Revenue Operations functions to support key deliverables. Reconcile affiliate network revenue by vertical and publisher brand at the monthly level. Where discrepancies exist, investigate to isolate whether specific days, products, providers, or commission values are responsible. Validate new tickets going onto the Data Engineering JIRA board to ensure requests going into Data Engineering are complete, accurate and as descriptive as possible. Update investigation results in JIRA tickets and save all outputs in the mapping Google Sheet. Use the Postman API and webhooks to pull revenue data from partner portals and verify it against BigQuery (a minimal sketch of this check follows below). Monitor API failures, rate limits, and response inconsistencies impacting revenue ingestion. As necessary, seek revenue clarifications from the vertical's RevOps team member. As necessary, clarify JIRA commentary for data engineers. Understand requirements, goals and priorities, and communicate progress towards data goals to stakeholders. Ensure outputs are on time and on target.

Required competencies: At least two (2) years of data quality analysis experience. A strong understanding of SQL and how it can be used to validate data (experience with BigQuery is a plus). An understanding of large relational databases and how to navigate these datasets to find the data required. Ability to communicate data to non-technical audiences through reports and visualisations. Strong interpersonal and communication skills. Comfortable working remotely and collaboratively with teammates across multiple geographies and time zones.

Perks: Day off on the 3rd Friday of every month (one long weekend each month). Monthly Wellness Reimbursement Program to promote health and well-being. Monthly Office Commutation Reimbursement Program. Paid paternity and maternity leaves.

How to apply for this opportunity - an easy 3-step process: 1. Click on Apply! and register or log in on our portal. 2. Upload your updated resume and complete the screening form. 3. Increase your chances of being shortlisted and meet the client for the interview!

About Our Client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal apart from this one.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
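For illustration, a minimal sketch of the portal-versus-warehouse reconciliation described above. The partner endpoint, auth token, response shape, and BigQuery table are all hypothetical placeholders, not Forbes Advisor's actual systems:

```python
# Minimal sketch: compare a day's revenue reported by a partner API
# against what landed in BigQuery. Endpoint, schema, and table names
# are hypothetical; requires `requests` and `google-cloud-bigquery`.
import requests
from google.cloud import bigquery

PARTNER_URL = "https://api.example-partner.com/v1/revenue"  # hypothetical

def portal_revenue(day: str) -> float:
    resp = requests.get(
        PARTNER_URL,
        params={"date": day},
        headers={"Authorization": "Bearer <token>"},  # placeholder
        timeout=30,
    )
    resp.raise_for_status()  # surfaces API failures and rate limiting
    return sum(item["revenue"] for item in resp.json()["results"])

def warehouse_revenue(day: str) -> float:
    client = bigquery.Client()
    sql = f"""
        SELECT SUM(revenue) AS total
        FROM `proj.revops.partner_revenue`  -- hypothetical table
        WHERE revenue_date = '{day}'
    """
    return list(client.query(sql).result())[0].total or 0.0

day = "2024-01-01"
diff = portal_revenue(day) - warehouse_revenue(day)
print(f"{day}: discrepancy of {diff:.2f}; investigate if non-zero")
```

A non-zero discrepancy would then be narrowed down by day, product, provider, or commission value, with findings logged in the JIRA ticket as described above.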
Posted 2 months ago
7 - 10 years
8 - 14 Lacs
Patna
Work from Office
Role: Data Engineer. We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities: - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration (see the sketch after this listing). - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications: - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications: - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
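For illustration, a minimal sketch of one step in an Oracle-to-BigQuery migration of this kind: copying a single table on Dataproc with PySpark, reading over JDBC and writing through the spark-bigquery connector (which must be available on the cluster). Hosts, credentials, buckets, and table names are placeholders:

```python
# Minimal sketch of a one-table Oracle-to-BigQuery copy on Dataproc.
# Assumes the Oracle JDBC driver and spark-bigquery connector are on the
# cluster classpath; connection details below are placeholders only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-to-bq").getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")  # placeholder
    .option("dbtable", "SALES.ORDERS")
    .option("user", "etl_user")
    .option("password", "<secret>")  # use a secret manager in practice
    .option("fetchsize", 10000)      # larger fetches for bulk reads
    .load()
)

(
    df.write.format("bigquery")
    .option("table", "proj.sales.orders")
    .option("temporaryGcsBucket", "my-staging-bucket")  # placeholder bucket
    .mode("overwrite")
    .save()
)
```

A real migration would wrap many such copies in an orchestrated, validated pipeline (e.g., Airflow), with row-count and checksum reconciliation after each load.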
Posted 2 months ago
6 - 11 years
10 - 18 Lacs
Noida, Indore
Work from Office
Role & responsibilities
Job Description: We are looking for a GCP Data Engineer and SQL Programmer with good working experience in PostgreSQL and PL/SQL programming, and the following technical skills:
- PL/SQL and PostgreSQL programming; ability to write complex SQL queries and stored procedures
- Migration: working experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL
- Working experience with Cloud SQL/AlloyDB
- Working experience tuning autovacuum in PostgreSQL (a tuning sketch follows this posting)
- Working experience tuning AlloyDB/PostgreSQL for better performance
- Working experience with BigQuery, Firestore, Memorystore, Spanner and bare-metal setup for PostgreSQL
- Experience with GCP Database Migration Service
- Working experience with MongoDB
- Working experience with Cloud Dataflow
- Working experience with database disaster recovery, database job scheduling and database logging techniques
- Knowledge of OLTP and OLAP
Desirable: GCP Database Engineer certification.
Other skills: out-of-the-box thinking; problem-solving skills; ability to make tech choices (build vs. buy); performance management (profiling, benchmarking, testing, fixing); enterprise architecture; project management/delivery capability/quality mindset; scope management; planning (phasing, critical path, risk identification); schedule management/estimations; leadership skills.
Other soft skills: learning ability; innovative and shows initiative.
Preferred candidate profile - Roles & Responsibilities:
- Develop, construct, test, and maintain data architectures
- Migrate an enterprise Oracle database from on-prem to GCP
- Tune autovacuum in PostgreSQL
- Tune AlloyDB/PostgreSQL for better performance
- Performance-tune PostgreSQL stored procedure code and queries
- Convert Oracle stored procedures and queries to PostgreSQL stored procedures and queries
- Create a hybrid data store with data warehouse and NoSQL GCP solutions along with PostgreSQL
- Migrate Oracle table data from Oracle to AlloyDB
- Lead the database team
Mandatory Skills: PostgreSQL, PL/SQL, BigQuery
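For illustration, a minimal autovacuum-tuning sketch of the kind this role mentions, using psycopg2. The connection string, table name, and thresholds are hypothetical and would need to be tuned against real bloat metrics:

```python
# Minimal sketch: tighten autovacuum on a hot, frequently-updated table so
# dead tuples are reclaimed sooner. All values below are placeholders.
import psycopg2

conn = psycopg2.connect("host=10.0.0.5 dbname=app user=dba")  # placeholder
conn.autocommit = True  # apply the DDL immediately

with conn.cursor() as cur:
    # Per-table overrides: vacuum after ~2% churn instead of the 20% default.
    cur.execute("""
        ALTER TABLE orders SET (
            autovacuum_vacuum_scale_factor = 0.02,
            autovacuum_vacuum_threshold = 1000,
            autovacuum_analyze_scale_factor = 0.01
        )
    """)
    # Verify the settings are keeping dead-tuple counts under control.
    cur.execute("""
        SELECT relname, n_dead_tup, last_autovacuum
        FROM pg_stat_user_tables
        ORDER BY n_dead_tup DESC
        LIMIT 5
    """)
    for row in cur.fetchall():
        print(row)
```

The same ALTER TABLE storage parameters apply on Cloud SQL and AlloyDB for PostgreSQL, where instance-level autovacuum settings are managed through database flags instead.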
Posted 2 months ago
5 - 10 years
0 - 3 Lacs
Hyderabad
Hybrid
Job Profile: We are seeking a Senior Data Engineer with proven expertise in designing and maintaining scalable, efficient, and reliable data pipelines. The ideal candidate should have strong proficiency in SQL, DBT, BigQuery, Python, and Airflow, along with a solid foundation in data warehousing principles. In this role, you will be instrumental in managing and optimizing data workflows, ensuring high data quality, and supporting data-driven decision-making across the organization. Experience with Oracle ERP systems and knowledge of data migration to a data warehouse environment will be considered a valuable advantage. Years of Experience: 5 to 10 years. Shift Timings: 1 PM to 10 PM IST.
Skill Set:
• SQL: Advanced proficiency in writing optimized queries, working with complex joins, CTEs, window functions, etc.
• DBT (Data Build Tool): Experience in modelling data with dbt, managing data transformations, and maintaining project structure.
• Python: Proficient in writing data processing scripts and building Airflow DAGs using Python (a minimal DAG sketch follows this posting).
• BigQuery: Strong experience with GCP's BigQuery, including dataset optimization, partitioning, and query cost management.
• Apache Airflow: Experience building and managing DAGs, handling dependencies, scheduling jobs, and error handling.
• Data Warehousing Concepts: Strong grasp of ETL/ELT, dimensional modelling (star/snowflake), fact/dimension tables, slowly changing dimensions, etc.
• Version Control: Familiarity with Git/GitHub for code collaboration and deployment.
• Cloud Platforms: Working knowledge of Google Cloud Platform (GCP).
Job Description - Roles & Responsibilities: Design, build, and maintain robust ETL/ELT data pipelines using Python, Airflow, and DBT. Develop and manage dbt models to enable efficient, reusable, and well-documented data transformations. Collaborate with stakeholders to gather data requirements and design data marts comprising fact and dimension tables in a well-structured star schema. Manage and optimize data models and transformation logic in BigQuery, ensuring high performance and cost-efficiency. Implement and uphold robust data quality checks, logging, and alerting mechanisms to ensure reliable data delivery. Maintain the BigQuery data warehouse, including routine optimizations and updates. Enhance and support the data warehouse architecture, including the use of star/snowflake schemas, partitioning strategies, and data mart structures. Proactively monitor and troubleshoot production pipelines to minimize downtime and ensure data accuracy.
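For illustration, a minimal sketch of the Airflow-plus-dbt orchestration pattern this role describes: a DAG that runs dbt and then a BigQuery freshness check. The project path, table, and schedule are hypothetical, and the imports assume Airflow 2 with the Google provider installed:

```python
# Minimal sketch of a daily ELT DAG: run dbt, then fail loudly if the
# fact table did not receive fresh data. Paths and names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="0 3 * * *",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --target prod",  # placeholder path
    )

    freshness_check = BigQueryInsertJobOperator(
        task_id="freshness_check",
        configuration={
            "query": {
                # ERROR() aborts the query (and fails the task) when stale.
                "query": """
                    SELECT IF(MAX(load_date) >= CURRENT_DATE(), 1,
                              ERROR('stale fact table')) AS ok
                    FROM `proj.mart.fct_orders`
                """,
                "useLegacySql": False,
            }
        },
    )

    run_dbt >> freshness_check
```

Failing the DAG on staleness, rather than silently logging it, is one simple way to implement the "data quality checks and alerting" responsibility listed above.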
Posted 2 months ago
3 - 8 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 15 to 30 LPA. Experience: 3 to 8 years. Location: Gurgaon/Bangalore/Pune/Chennai. Notice: Immediate to 30 days.
Key Responsibilities & Skillsets:
Common skillsets:
- 3+ years of experience in analytics with SAS, PySpark, Python, Spark, SQL and associated data engineering work
- Must have experience managing and transforming big data sets using PySpark, Spark-Scala, NumPy and pandas (a short PySpark sketch follows this posting)
- Excellent communication & presentation skills
- Experience managing Python code bases and collaborating with customers on model evolution
- Good knowledge of database management and Hadoop/Spark, SQL, Hive, Python (expertise)
- Superior analytical and problem-solving skills
- Able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision
- Good communication skills for client interaction
Data management skillsets:
- Ability to understand data models and identify ETL optimization opportunities; exposure to ETL tools is preferred
- Strong grasp of advanced SQL functionality (joins, nested queries, and procedures)
- Strong ability to translate functional specifications/requirements into technical requirements
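For illustration, a short PySpark sketch of the kind of big-data transformation work listed above: a window-function ranking plus a monthly aggregation. Paths and column names are hypothetical:

```python
# Minimal sketch: rank transactions per customer and compute monthly spend.
# The GCS paths and columns are illustrative placeholders.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-summary").getOrCreate()

txns = spark.read.parquet("gs://bucket/transactions/")  # placeholder path

# Rank each customer's transactions by amount and keep the top 3.
w = Window.partitionBy("customer_id").orderBy(F.col("amount").desc())
top3 = (
    txns.withColumn("rank", F.row_number().over(w))
        .filter(F.col("rank") <= 3)
)

# Monthly spend per customer, written back partitioned for faster reads.
monthly = (
    txns.groupBy("customer_id",
                 F.date_trunc("month", "txn_ts").alias("month"))
        .agg(F.sum("amount").alias("total_spend"))
)
monthly.write.mode("overwrite").partitionBy("month").parquet("gs://bucket/out/")
```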
Posted 2 months ago
7 - 10 years
8 - 14 Lacs
Pune
Work from Office
Role : Data Engineer We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications :- Proven experience as a Data Engineer or similar role.- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.- In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills.- Strong analytical and problem-solving abilities.- Ability to work collaboratively with cross-functional teams.- Excellent communication and interpersonal skills. Preferred Qualifications :- Experience with other data management tools and technologies.- Knowledge of cloud-based data solutions.- Certification in data engineering or related fields.
Posted 2 months ago
7 - 10 years
8 - 14 Lacs
Lucknow
Work from Office
We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
Posted 2 months ago
7 - 10 years
8 - 14 Lacs
Bengaluru
Work from Office
We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
Posted 2 months ago
5 - 9 years
18 - 20 Lacs
Noida
Work from Office
Experience: 5-7 years. Location: Noida. Position: Data Analyst.
Technical Skills: Strong proficiency in Python (Pandas, NumPy, Matplotlib, Seaborn, etc.). Advanced SQL skills for querying large datasets. Experience with data visualization tools (Looker, etc.). Hands-on experience with data wrangling, cleansing, and transformation (a short example follows this posting). Familiarity with ETL processes and working with structured/unstructured data.
Analytical & Business Skills: Strong problem-solving skills with the ability to interpret complex data. Business acumen to connect data insights with strategic decision-making. Excellent communication and presentation skills.
Preferred (Nice to Have): Knowledge of machine learning concepts (scikit-learn, TensorFlow, etc.). Exposure to cloud platforms (GCP) for data processing.
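For illustration, a short pandas sketch of a typical wrangling-and-cleansing pass like the one this role describes; the input file and column names are hypothetical:

```python
# Minimal sketch: normalise types, handle missing values and duplicates,
# cap outliers, and summarise. File and columns are illustrative.
import numpy as np
import pandas as pd

df = pd.read_csv("sales_raw.csv")  # placeholder input

# Normalise types and obvious entry errors.
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df["region"] = df["region"].str.strip().str.title()

# Drop exact duplicates; flag (rather than silently fill) missing amounts.
df = df.drop_duplicates()
df["amount_missing"] = df["amount"].isna()
df["amount"] = df["amount"].fillna(0.0)

# Simple outlier guard: cap amounts at the 99th percentile.
cap = df["amount"].quantile(0.99)
df["amount"] = np.minimum(df["amount"], cap)

summary = df.groupby("region")["amount"].agg(["count", "sum", "mean"])
print(summary)
```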
Posted 2 months ago
4 - 9 years
15 - 19 Lacs
Pune
Work from Office
About The Role - Job Title: Technical Specialist, GCP Developer. Location: Pune, India.
Role Description: This role is for an engineer responsible for the design, development, and unit testing of software applications, ensuring that good-quality, maintainable, scalable, and high-performing software is delivered to users in an Agile development environment. The candidate should come from a strong technological background with good working experience in Spark and GCP technology, should be hands-on and able to work independently with minimal technical/tool guidance, and should be able to technically guide and mentor junior resources in the team. As a developer you will bring extensive design and development skills to strengthen the group of developers within the team. You will extensively use and apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; an Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complementary health screening for those 35 and above.
Your key responsibilities: Design and discuss your own solutions for addressing user stories and tasks. Develop, unit-test, integrate, deploy, maintain, and improve software. Perform peer code review. Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meeting, sprint planning, retrospectives, etc. Apply continuous integration best practices (SCM, build automation, unit testing, dependency management). Collaborate with other team members to achieve the sprint objectives. Report progress and update Agile team management tools (JIRA/Confluence). Manage individual task priorities and deliverables. Take responsibility for the quality of the solutions you provide. Contribute to planning and continuous improvement activities, and support the PO, ITAO, developers and Scrum Master.
Your skills and experience: Engineer with good development experience on Google Cloud Platform for at least 4 years. Hands-on experience with BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL and Cloud Functions. Experience in the set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps; able to create and maintain fully automated CI build processes and write build and deployment scripts. Experience with development platforms (OpenShift/Kubernetes/Docker configuration and deployment) and DevOps tools such as Git, TeamCity, Maven, SONAR. Good knowledge of core SDLC processes and tools such as HP ALM, Jira, ServiceNow. Knowledge of working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, JSON. Strong analytical skills. Proficient communication skills; fluent in English (written/verbal). Ability to work in virtual teams and matrixed organizations. An excellent team player, open-minded and willing to learn the business and technology, who keeps pace with technical innovation, understands the relevant business area, and shares information and transfers knowledge to team members.
How we'll support you Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
Posted 2 months ago
7 - 12 years
32 - 37 Lacs
Jaipur
Work from Office
About The Role - Job Title: Analytics Senior Analyst. Location: Jaipur, India. Corporate Title: AVP.
Role Description: You will be joining the Data & Analytics team as part of the Global Procurement division. The team's purpose is to: deliver trusted third-party data and insights to unlock commercial value and identify risk; develop and execute the Global Procurement Data Strategy; deliver the golden source of Global Procurement data, analysis and insights via dbPi, our Tableau self-service platform, leveraging automation and scalability on Google Cloud; and provide data and analytical support to Global Procurement's prioritised change initiatives. The team leverages several tools and innovative techniques to create value-added insights for stakeholders across end-to-end procurement processes including, but not limited to, third-party risk, contracting, spend, and performance management.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; an Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complementary health screening for those 35 and above.
Your key responsibilities: You develop a sound understanding of the various tools and the entire suite of analytical offerings on the standard procurement insights platform, dbPi. You support our stakeholders by understanding their requirements, challenging appropriately where needed in order to scope the problem, conceptualizing the optimum approach, and developing solutions using appropriate tools and visualisation techniques. You are comfortable leading small project teams in delivering the analytics change book of work, keeping internal and external stakeholders updated on project progress while driving forward the key change topics. For requests which are more complex in nature, you connect the dots and arrive at a solution by establishing linkages across different systems and processes. You take end-to-end responsibility for any change request to an existing analytical product/dashboard, from understanding the requirement through development, testing and QA, to delivering it to stakeholders' satisfaction. You are expected to deliver automation and clean-data initiatives, such as deployment of a rules engine, data quality checks enabled through Google Cloud, and bringing Procurement data sources into GCP. You act as a thought partner in the Chief Information Office's deployment of Google Cloud Platform to migrate the data infrastructure layer (ETL processes) currently managed by the Analytics team. You should be able to work in close collaboration with cross-functional teams, including developers, system administrators, and business stakeholders.
Your skills and experience: We are looking for talents with a degree (or equivalent) in Engineering, Mathematics, Statistics or the Sciences from an accredited college or university (or equivalent) to develop analytical solutions for our stakeholders to support strategic decision making. Any professional certification in Advanced Analytics, Data Visualisation or a Data Science-related domain is a plus. You have a natural curiosity for numbers and strong quantitative & logical thinking skills. You ensure results are of high data quality and accuracy. You have working experience on Google Cloud, have worked with cross-functional teams to enable data source and process migration to GCP, and have working experience with SQL. You are adaptable to emerging technologies, such as leveraging machine learning and AI to drive innovation. Procurement experience (useful, though not essential) across vendor management, sourcing, risk, contracts and purchasing, preferably within a global and complex environment. You have the aptitude to understand stakeholders' requirements, identify relevant data sources, integrate data, perform analysis and interpret the results by identifying trends and patterns. You enjoy the problem-solving process, think out of the box, and break down a problem into its constituent parts with a view to developing an end-to-end solution. You display enthusiasm for working in the data analytics domain and strive for continuous learning and improvement of your technical and soft skills. You demonstrate working knowledge of different analytical tools (Tableau, databases, Alteryx, Pentaho, Looker, BigQuery) in order to work with large datasets and derive insights for decision making. You enjoy working in a team, and your English language skills are convincing, making it easy for you to work in an international environment and with global, virtual teams.
How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
Posted 2 months ago
10 - 15 years
25 - 40 Lacs
Pune
Work from Office
Introduction: We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Solution Architect. As a Solution Architect, you will play a pivotal role in designing and implementing cloud-based solutions for our team using GCP. The ideal candidate will have a deep understanding of cloud architecture, a proven track record of delivering cloud-based solutions, and experience with GCP technologies. You will work closely with technical teams and clients to ensure the successful deployment and optimization of cloud solutions.
Responsibilities: Lead the design and architecture of GCP-based solutions, ensuring scalability, security, performance, and cost-efficiency. Collaborate with business stakeholders, engineering teams, and clients to understand technical requirements and translate them into cloud-based solutions. Provide thought leadership and strategic guidance on cloud technologies, best practices, and industry trends. Design and implement cloud-native applications, data platforms, and microservices on GCP. Ensure cloud solutions are aligned with clients' business goals and requirements, with a focus on automation and optimization. Conduct cloud assessments, identifying areas for improvement, migration strategies, and cost-saving opportunities. Oversee and manage the implementation of GCP solutions, ensuring seamless deployment and operational success. Create detailed documentation of cloud architecture, deployment processes, and operational guidelines. Engage in pre-sales activities, including solution design, proofs of concept (PoCs), and presenting GCP solutions to clients. Ensure compliance with security and regulatory requirements in the cloud environment.
Requirements: At least 2+ years of experience as a Cloud Architect or in a similar role, with strong expertise in Google Cloud Platform. In-depth knowledge of GCP services, including Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud Functions, and networking. Experience with infrastructure-as-code tools such as Terraform. Strong understanding of cloud security, identity management, and compliance frameworks (e.g., GDPR, HIPAA). Hands-on experience with GCP networking, IAM, and logging/monitoring tools (Cloud Monitoring, Cloud Logging). Strong experience in designing and deploying highly available, fault-tolerant, and scalable solutions. Proficiency in programming languages like Java and Golang. Experience with containerization and orchestration technologies such as Docker, Kubernetes, and GKE (Google Kubernetes Engine). Experience in cloud cost management and optimization using GCP tools.
Posted 2 months ago
7 - 10 years
8 - 14 Lacs
Mumbai
Work from Office
We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
Posted 2 months ago
7 - 10 years
8 - 14 Lacs
Surat
Work from Office
We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
Posted 2 months ago
7 - 10 years
8 - 14 Lacs
Kanpur
Work from Office
Role: Data Engineer. We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities: - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications: - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications: - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
Posted 2 months ago
7 - 10 years
8 - 14 Lacs
Hyderabad
Work from Office
Role: Data Engineer. We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities: - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications: - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications: - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
Posted 2 months ago
5 - 10 years
20 - 35 Lacs
Bengaluru
Hybrid
GCP Data Engineer - 5+ Years of experience - GCP (all services needed for Big Data pipelines like BigQuery, DataFlow, Pub/Sub, BigTable, Data Fusion, DataProc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine), Spark, Scala, Hadoop - Python, PySpark, Orchestration (Airflow), SQL CI/CD (experience with Deployment pipelines) Architecture and Design of cloud-based Big Data pipelines and exposure to any ETL tools Nice to Have - GCP certifications
Posted 2 months ago
3 - 5 years
9 - 18 Lacs
Chennai
Hybrid
Role & responsibilities: As part of the Tonik Data Analytics team, the candidate will be responsible for: developing and enhancing the Data Lake Framework for ingestion of data from various sources and providing reports to downstream systems/users; working with the different Tonik IT teams to implement the data requirements; developing data pipelines based on requirements on key GCP services, i.e. BigQuery, Airflow and GCS, using Python/SQL (a minimal load sketch follows this posting); and ensuring the proper GCP standards are followed for the implementation.
Preferred candidate profile: Hands-on experience in at least one programming language (Python, Java). Working experience on a cloud platform (AWS/GCP/Azure). Experience with design patterns and designing scalable solutions. Hands-on experience in SQL, and able to convert requirements into standard, scalable, cost-effective, high-performing implementations. Aware of data engineering principles and data pipeline techniques. Communicates effectively with stakeholders and other team members. Has implemented E2E automated pipelines to ingest data from different formats (CSV, fixed width, JSON, etc.). Works closely with various business teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform. Works with Agile delivery and implementation approaches. Experienced, with hands-on experience in the following key offerings from GCP or equivalent services from AWS: Composer/Airflow, BigQuery, Dataflow, Cloud Storage, Apache Beam, Dataproc. Should have a good understanding of security-related configuration in BigQuery and how to handle data securely while sharing. Nice to have: exposure to or knowledge of ML and ML pipelines.
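For illustration, a minimal sketch of one ingestion step such a Data Lake Framework might perform: loading a CSV landed in GCS into a partitioned BigQuery table with the Python client. The bucket, dataset, and schema are hypothetical:

```python
# Minimal sketch: batch-load a CSV from GCS into a date-partitioned
# BigQuery table. Bucket, table, and schema below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # header row
    schema=[
        bigquery.SchemaField("txn_id", "STRING"),
        bigquery.SchemaField("txn_date", "DATE"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
    time_partitioning=bigquery.TimePartitioning(field="txn_date"),
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://tonik-landing/txns/2024-01-01/*.csv",  # placeholder URI
    "proj.lake.transactions",                    # placeholder table
    job_config=job_config,
)
load_job.result()  # waits; raises on bad rows or schema mismatch
print(f"Loaded {load_job.output_rows} rows")
```

In a framework like the one described, the URI, schema, and format options (CSV, fixed width, JSON) would come from per-source configuration rather than being hard-coded.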
Posted 2 months ago