2.0 - 6.0 years
6 - 10 Lacs
pune
Work from Office
You bring systems design experience with the ability to architect and explain complex systems interactions, data flows, common interfaces, and APIs. You bring a deep understanding of and experience with software development and programming languages such as Java/Kotlin and shell scripting. You have hands-on experience with the following technologies as a senior software developer: Java/Kotlin, Spring, Spring Boot, WireMock, Docker, Terraform, GCP services (Kubernetes, CloudSQL, Pub/Sub, Storage, Logging, Dashboards), Oracle & Postgres, SQL, PgWeb, Git, GitHub & GitHub Actions; GCP Professional Data Engineer certification.
Responsibilities:
- Data Pipeline Development: Design, implement, and optimize data pipelines on GCP using PySpark for efficient and scalable data processing.
- ETL Workflow Development: Build and maintain ETL workflows for extracting, transforming, and loading data into various GCP services.
- GCP Service Utilization: Leverage GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc for data storage, processing, and analysis.
- Data Transformation: Use PySpark for data manipulation, cleansing, enrichment, and validation.
- Performance Optimization: Ensure the performance and scalability of data processing jobs on GCP.
- Collaboration: Work with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
- Data Quality and Governance: Implement and maintain data quality standards, security measures, and compliance with data governance policies on GCP.
- Troubleshooting and Support: Diagnose and resolve issues related to data pipelines and infrastructure.
- Staying Updated: Keep abreast of the latest GCP services, PySpark features, and best practices in data engineering.
Required Skills:
- GCP Expertise: Strong understanding of GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc.
- PySpark Proficiency: Demonstrated experience in using PySpark for data processing, transformation, and analysis.
- Python Programming: Solid Python programming skills for data manipulation and scripting.
- Data Modeling and ETL: Experience with data modeling, ETL processes, and data warehousing concepts.
- SQL: Proficiency in SQL for querying and manipulating data in relational databases.
- Big Data Concepts: Understanding of big data principles and distributed computing concepts.
- Communication and Collaboration: Ability to effectively communicate technical solutions and collaborate with cross-functional teams.
Posted 1 day ago
6.0 - 11.0 years
4 - 8 Lacs
bengaluru
Work from Office
The Senior Applications Developer provides input and support for, and performs, full systems life cycle management activities (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. You will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge and support for applications development, integration, and maintenance. You will provide input to department and project teams on decisions supporting projects.
Competencies: Disaster Recovery; Information Analysis and Solution Generation; Information Systems; Internal Systems; IT Design/Develop Application Solutions; IT Knowledge of Emerging Technology; IT Problem Management/Planning; Technical Problem Solving and Analytical Processes; Technical Writing.
Job Requirements:
- Contribute to IS Projects: Conduct systems and requirements analyses to identify project action items.
- Perform Analysis and Design: Participate in defining and developing technical specifications to meet systems requirements.
- Design and Develop Moderate to Highly Complex Applications: Analyze, design, code, test, correct, and document moderate to highly complex programs to ensure optimal performance and compliance.
- Develop Application Documentation: Develop and maintain system documentation to ensure accuracy and consistency.
- Produce Integration Builds: Define and produce integration builds to create applications.
- Perform Maintenance and Support: Define and administer procedures to monitor systems performance and integrity.
- Support Emerging Technologies and Products: Monitor the industry to gain knowledge and understanding of emerging technologies.
Must have GCP and BigQuery experience. Should have Power BI, Microservice Architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, AzureAD, HTTP, and readme documentation experience. Should be proficient in Git, Scrum, and Azure DevOps.
Basic qualifications:
- 6+ years of experience with Java, including building complex, scalable applications.
- 6+ years of experience in Spring Boot, including designing and implementing advanced microservices architectures.
- 4+ years of GCP and BigQuery experience.
Ideal Mindset:
- Lifelong Learner: You are always seeking to improve your technical and nontechnical skills.
- Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.
Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST.
Posted 1 day ago
2.0 - 6.0 years
3 - 7 Lacs
bengaluru
Remote
Job Requirements: The Quality Assurance Automation Tester is responsible for designing, developing, and executing automated tests to ensure software quality and performance. The ideal candidate combines strong technical skills in automation frameworks and scripting with a solid understanding of testing methodologies and agile processes. Certifications and experience with CI/CD tools, API testing, and cloud platforms are highly desirable.
Technical Expertise:
- Proficiency in automation tools such as Selenium or JMeter.
- Strong scripting skills in Java, Python, or JavaScript.
- Experience with test management tools like ADO.
- Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI/CD) and version control systems.
- Experience planning, designing, and implementing testing strategies and automation infrastructure for large-scale system software.
This role requires proficiency in GCP and BigQuery, along with a solid understanding of various testing frameworks and tools.
- Proficiency in automated testing tools: Selenium, .NET, JUnit, NUnit, JMeter, Jenkins, SonarQube, ZAP/OWASP.
- Experience with GCP and BigQuery.
- Strong understanding of CI/CD practices and tools.
- Familiarity with scripting languages.
- Knowledge of security testing tools like ZAP/OWASP and Netsparker.
- Experience with API testing using Swagger and Postman.
- Proficiency in Git, Scrum, and Azure DevOps.
Preferred Qualifications:
- Experience with backend development.
- Familiarity with Power BI, Microservice Architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, AzureAD, HTTP.
- Experience with readme documentation.
Ideal Mindset:
- Lifelong Learner: You are always seeking to improve your technical and nontechnical skills.
- Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.
Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST.
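As a small illustration of the automated-testing style the listing describes: the hedged, stdlib-only sketch below validates a hypothetical JSON API payload with `unittest` (a UI flow would use Selenium instead; the payload schema and field names here are assumptions).

```python
import json
import unittest

SAMPLE_RESPONSE = json.dumps(
    {"status": "ok", "items": [{"sku": "A1", "price": 10.0}]}
)

def validate_order_payload(body: str) -> list[str]:
    """Return a list of validation errors (empty list means the payload passed)."""
    errors = []
    data = json.loads(body)
    if data.get("status") != "ok":
        errors.append("status is not ok")
    for i, item in enumerate(data.get("items", [])):
        if item.get("price", -1) < 0:
            errors.append(f"items[{i}].price is negative or missing")
    return errors

class OrderPayloadTests(unittest.TestCase):
    def test_valid_payload_passes(self):
        self.assertEqual(validate_order_payload(SAMPLE_RESPONSE), [])

    def test_negative_price_is_flagged(self):
        bad = json.dumps({"status": "ok", "items": [{"sku": "B2", "price": -5}]})
        self.assertIn("items[0].price is negative or missing",
                      validate_order_payload(bad))

# Run the suite programmatically (a CI job would invoke the test runner instead).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(OrderPayloadTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all tests passed:", result.wasSuccessful())
```

In a CI/CD setup (Jenkins, GitLab CI, Azure DevOps) this kind of suite would run on every commit, failing the pipeline when a validation regresses.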
Posted 1 day ago
4.0 - 9.0 years
14 - 18 Lacs
noida
Work from Office
Job Role: We are looking to hire approximately 3-5 talented full-stack developers with a Python or Node.js/TypeScript background to work on building Gen AI solutions. The ideal candidate should have 4-10 years of practical work experience in areas such as integrating and managing APIs, async programming frameworks/libraries, state management, concurrency, containerization, and telemetry. A candidate with the desire to work on Gen AI projects would ideally have enrolled in Gen AI courses and done some independent reading and exploration.
Mandatory Competencies:
- Data Science and Machine Learning: Gen AI, Python
- User Interface: Node.js, TypeScript
- Behavioral: Communication
Posted 1 day ago
8.0 - 11.0 years
9 - 13 Lacs
noida
Work from Office
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing efforts for data pipelines and database operations, ensuring optimal query and data load performance.
- Test Data Management: Create and manage robust test data sets for various testing phases, including positive, negative, and edge case scenarios.
- Defect Management: Identify, document, track, and re-test defects in data, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports. Provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities for process improvements in data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.
Required Skills & Experience:
- Experience: 8-11 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise:
  - Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
  - Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
  - Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise:
  - Strong understanding and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
  - Knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.
- Time Zone Overlap: Proven ability and willingness to provide consistent working-hour overlap until noon EST to collaborate with teams in the Eastern Standard Time zone.
Mandatory Competencies:
- ETL: Tester
- Database: SQL programming; PostgreSQL; Oracle database design
- FS Domain: Market Risk (risk factor sensitivity analysis); Financial Services
- Behavioral: Communication
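The SQL-driven validation and reconciliation work described above can be sketched with a self-contained example. This uses `sqlite3` with hypothetical `src_trades`/`tgt_trades` tables purely for illustration; on the job the same checks would run against Oracle, Snowflake, or BigQuery.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, notional REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER, notional REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_trades VALUES (1, 100.0), (2, 250.5);  -- trade 3 missing
""")

# Completeness check: rows present in the source but absent from the target.
missing = conn.execute("""
    SELECT s.trade_id FROM src_trades s
    LEFT JOIN tgt_trades t ON t.trade_id = s.trade_id
    WHERE t.trade_id IS NULL
""").fetchall()

# Accuracy check: aggregate totals must reconcile between the two systems.
src_sum, tgt_sum = conn.execute("""
    SELECT (SELECT SUM(notional) FROM src_trades),
           (SELECT SUM(notional) FROM tgt_trades)
""").fetchone()

print("missing trade_ids:", [r[0] for r in missing])
print("notional diff:", round(src_sum - tgt_sum, 2))
```

A full source-to-target validation suite would add column-level checksums and transformation-rule assertions on top of these completeness and accuracy checks.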
Posted 1 day ago
3.0 - 8.0 years
10 - 18 Lacs
chandigarh
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Mandatory Key Skills: Data analytics, ETL, SQL, Python, Google BigQuery, AWS Redshift, Data architecture
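The extract-transform-load flow this role centers on can be shown end to end in a toy form. This sketch uses an assumed `user_id`/`amount` schema and sqlite as a stand-in target; a production pipeline would load Redshift or BigQuery instead.

```python
import csv
import io
import sqlite3

RAW = "user_id,amount\n1,10.5\n2,not_a_number\n3,4.0\n"

def extract(text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and drop malformed rows."""
    out = []
    for r in rows:
        try:
            out.append((int(r["user_id"]), float(r["amount"])))
        except ValueError:
            continue  # a real pipeline would quarantine these for review
    return out

def load(rows, conn):
    """Load: insert into the target table and report totals."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales (user_id INT, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_sales").fetchone()

conn = sqlite3.connect(":memory:")
count, total = load(transform(extract(RAW)), conn)
print(count, total)  # 2 rows survive the bad-row filter; total is 14.5
```

Keeping extract, transform, and load as separate functions makes each stage independently testable, which matters once the pipeline grows.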
Posted 1 day ago
5.0 - 10.0 years
5 - 9 Lacs
bengaluru
Work from Office
Job Summary: This position provides input and support for, and performs, full systems life cycle management activities (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software). Participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations, collaborates with teams, and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.
Responsibilities:
- Performs systems analysis and design.
- Designs and develops moderate to highly complex applications.
- Develops application documentation.
- Produces integration builds.
- Performs maintenance and support.
- Supports emerging technologies and products.
At least 5 years of experience required.
Mandatory Skills: Git, Scrum, Azure DevOps, GCP, BigQuery, Power BI, Microservice Architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, AzureAD, HTTP, readme documentation.
Other Qualifications:
- Understanding of Agile development
- Strong written and verbal communication skills
- Ability to work in a team environment
- Accountability, attention to detail, and initiative
- Bachelor's degree in computer science or a related discipline, or the equivalent in education and work experience
Posted 1 day ago
8.0 - 13.0 years
5 - 9 Lacs
bengaluru
Work from Office
- Install, configure, and maintain database systems (e.g., Oracle, SQL Server, MySQL, PostgreSQL, or cloud-based databases).
- Monitor database performance and proactively tune queries, indexing, and configurations to ensure optimal efficiency.
- Manage database security, including role-based access control, encryption, and auditing.
- Oversee backup, recovery, high availability (HA), and disaster recovery (DR) strategies.
- Perform database upgrades, patching, migrations, and capacity planning.
- Collaborate with developers to optimize queries, stored procedures, and schema design.
- Automate routine tasks using scripts (Python, Bash, PowerShell, etc.).
- Implement and manage replication, clustering, and failover solutions.
- Maintain documentation of database configurations, policies, and procedures.
- Stay updated with emerging technologies, best practices, and compliance requirements.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- 8+ years of hands-on experience as a Database Administrator.
- Strong expertise in at least one major RDBMS (Oracle, SQL Server, PostgreSQL, MySQL).
- Proficiency in performance tuning, query optimization, and troubleshooting.
- Experience with Exadata and RAC setup.
- Experience with HA/DR solutions such as Always On, Patroni, Data Guard, or replication technologies.
- Fundamental knowledge of Linux/UNIX and different kinds of storage.
- Knowledge of cloud database services (AWS RDS/Redshift, Azure SQL, Google Cloud SQL/BigQuery).
- Solid understanding of data security and compliance standards (GDPR, HIPAA, PCI-DSS).
- Strong scripting and automation skills (Python, Shell, PowerShell, etc.).
- Ability to perform multi-region replication and create database images in containers.
- Excellent analytical, problem-solving, and communication skills.
Preferred technical and professional experience:
- Certifications such as Oracle Certified Professional (OCP), Microsoft Certified: Azure Database Administrator, or AWS Certified Database - Specialty.
- Familiarity with DevOps practices, CI/CD pipelines, and database version control tools (Liquibase, Flyway).
- Exposure to big data technologies (Hadoop, Spark).
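The routine-task automation bullet above can be illustrated with a small, hedged sketch: a script that flags backup files older than a retention threshold. The `.dump` naming and the 24-hour policy are assumptions for the example; a real check would also verify backup integrity, not just age.

```python
import os
import tempfile
import time
from pathlib import Path

def stale_backups(backup_dir: Path, max_age_hours: float = 24.0):
    """Return names of backup files whose modification time exceeds the allowed age."""
    cutoff = time.time() - max_age_hours * 3600
    return sorted(p.name for p in backup_dir.glob("*.dump")
                  if p.stat().st_mtime < cutoff)

# Demo with a temporary directory standing in for the backup volume.
with tempfile.TemporaryDirectory() as d:
    fresh = Path(d) / "orders_today.dump"
    old = Path(d) / "orders_last_week.dump"
    fresh.touch()
    old.touch()
    week_ago = time.time() - 7 * 24 * 3600
    os.utime(old, (week_ago, week_ago))  # backdate the old backup's mtime
    print(stale_backups(Path(d)))
```

A cron- or scheduler-driven version of this would page the on-call DBA when the list is non-empty.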
Posted 1 day ago
6.0 - 11.0 years
6 - 11 Lacs
bengaluru
Work from Office
How We Will Help You: Joining our Java practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work. Once You Are Here, You Will: The Lead Applications Developer provides leadership in full systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery is on time and within budget. You will direct component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations. The Lead Applications Developer guides teams to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. The Lead Applications Developer will lead junior team members with project related activities and tasks. You will guide and influence department and project teams. This position facilitates collaboration with stakeholders. 
Competencies: Disaster Recovery; Foundation Architecture; Information Analysis and Solution Generation; Information Systems; Internal Systems; Assessing Business Needs; IT Design/Develop Application Solutions; IT Knowledge of Emerging Technology; IT Process, Methods, and Tools; IT Stakeholder Relationship Management; Project Risk Management; Problem Management and Project Planning; Technical Problem Solving and Analytical Processes; Technical Writing.
Job Requirements:
- Lead IS Projects: Delegate work assignments to complete the deliverables for small projects or components of larger projects to meet project plan requirements.
- Lead System Analysis and Design: Translate business and functional requirements into technical designs that meet stated business needs.
- Lead Design and Development of Applications: Identify new areas for process improvements to enhance performance results; deliver application solutions that meet business and non-functional requirements.
- Develop and Ensure Creation of Application Documentation: Determine documentation needs to deliver applications.
- Define and Produce Integration Builds: Lead build processes for target environments to create software; verify integration test specifications to ensure proper testing.
- Monitor Emerging Technology Trends: Monitor the industry to gain knowledge and understanding of emerging technologies.
- Lead Maintenance and Support: Drive problem resolution to identify, recommend, and implement process improvements.
- Lead Other Team Members: Provide input to people processes (e.g., quality performance review, career development, training, staffing) to give managers detailed performance-level information.
Must have GCP and BigQuery experience. Should have: Power BI, Microservice Architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, AzureAD, HTTP, readme documentation.
Basic qualifications:
- 6+ years of experience with Java, including building complex, scalable applications.
- 6+ years of experience in Spring Boot, including designing and implementing advanced microservices architectures.
- 4+ years of GCP and BigQuery experience.
Ideal Mindset:
- Lifelong Learner: You are always seeking to improve your technical and nontechnical skills.
- Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.
Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST.
Posted 1 day ago
6.0 - 11.0 years
4 - 8 Lacs
bengaluru
Work from Office
The Senior Applications Developer provides input and support for, and performs, full systems life cycle management activities (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. You will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge and support for applications development, integration, and maintenance. You will provide input to department and project teams on decisions supporting projects.
Competencies: Disaster Recovery; Information Analysis and Solution Generation; Information Systems; Internal Systems; IT Design/Develop Application Solutions; IT Knowledge of Emerging Technology; IT Problem Management/Planning; Technical Problem Solving and Analytical Processes; Technical Writing.
Job Requirements:
- Contribute to IS Projects: Conduct systems and requirements analyses to identify project action items.
- Perform Analysis and Design: Participate in defining and developing technical specifications to meet systems requirements.
- Design and Develop Moderate to Highly Complex Applications: Analyze, design, code, test, correct, and document moderate to highly complex programs to ensure optimal performance and compliance.
- Develop Application Documentation: Develop and maintain system documentation to ensure accuracy and consistency.
- Produce Integration Builds: Define and produce integration builds to create applications.
- Perform Maintenance and Support: Define and administer procedures to monitor systems performance and integrity.
- Support Emerging Technologies and Products: Monitor the industry to gain knowledge and understanding of emerging technologies.
Must have GCP and BigQuery experience. Should have Power BI, Microservice Architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, AzureAD, HTTP, and readme documentation experience. Should be proficient in Git, Scrum, and Azure DevOps.
Basic qualifications:
- 6+ years of experience with Java, including building complex, scalable applications.
- 6+ years of experience in Spring Boot, including designing and implementing advanced microservices architectures.
- 4+ years of GCP and BigQuery experience.
Ideal Mindset:
- Lifelong Learner: You are always seeking to improve your technical and nontechnical skills.
- Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.
Posted 1 day ago
2.0 - 5.0 years
10 - 14 Lacs
bengaluru
Work from Office
We are currently seeking a Business Consulting Senior Consultant to join our team in Bangalore, Karnataka (IN-KA), India (IN). Backend Consultant: all positions are for back-end developers, including the lead developer, and all must have GCP and BigQuery experience. No front-end skills are needed.
Required skills for lead and mid-level backend developers: Git, Scrum, Azure DevOps, GCP, BigQuery, Power BI, Microservice Architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, AzureAD, HTTP, readme documentation. Should have knowledge of API tools such as Swagger and Postman.
Posted 1 day ago
3.0 - 8.0 years
10 - 18 Lacs
varanasi
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Mandatory Key Skills: ETL pipelines, data warehouses, SQL, Python, AWS Redshift, Google BigQuery, ETL
Posted 1 day ago
3.0 - 8.0 years
10 - 18 Lacs
coimbatore
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Mandatory Key Skills: Python, AWS Redshift, Google BigQuery, ETL pipelines, data warehousing, data architectures, SQL
Posted 1 day ago
3.0 - 8.0 years
10 - 18 Lacs
mysuru
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Mandatory Key Skills: Data analytics, ETL, SQL, Python, Google BigQuery, AWS Redshift, Data architecture
Posted 1 day ago
3.0 - 8.0 years
10 - 18 Lacs
kanpur
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Mandatory Key Skills: Python, data warehousing, ETL, Amazon Redshift, BigQuery, data engineering, data architecture, AWS, machine learning, data flow, ETL pipelines, real-time data processing, Java, Spring Boot, microservices, Spark, Kafka, Cassandra, Scala, NoSQL, MongoDB, REST, Redis, SQL
Posted 1 day ago
3.0 - 8.0 years
10 - 18 Lacs
nagpur
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Mandatory Key Skills: SQL, Python, data warehousing, ETL, Amazon Redshift, BigQuery, data engineering, AWS, machine learning, real-time data processing, Java, Spring Boot, microservices, Spark, Kafka, Cassandra, Scala, NoSQL, MongoDB, Redis, data architecture
Posted 1 day ago
4.0 - 8.0 years
12 - 22 Lacs
chennai
Work from Office
Role & Responsibilities
Job Summary: As a GCP Data Engineer, you will be responsible for developing, optimizing, and maintaining data pipelines and infrastructure. Your expertise in SQL and Python will be instrumental in managing and transforming data, while your familiarity with cloud technologies will be considered an asset as we explore opportunities to enhance data engineering processes.
Job Description:
- Building Scalable Data Pipelines: Design, implement, and maintain end-to-end data pipelines to efficiently extract, transform, and load (ETL) data from diverse sources. Ensure data pipelines are reliable, scalable, and performance-oriented.
- SQL Expertise: Write and optimize complex SQL queries for data extraction, transformation, and reporting. Collaborate with analysts and data scientists to provide structured data for analysis.
- Cloud Platform Experience: Utilize cloud services to enhance data processing and storage capabilities. Work toward the integration of tools into the data ecosystem.
- Documentation and Collaboration: Document data pipelines, procedures, and best practices to facilitate knowledge sharing. Collaborate closely with cross-functional teams to understand data requirements and deliver solutions.
Required skills:
- 4+ years of experience with SQL and Python; 4+ years with GCP BigQuery, Dataflow, GCS, and Dataproc.
- 4+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner.
- Comfortable with a broad array of relational and non-relational databases.
- Proven track record of building applications in a data-focused role (cloud and traditional data warehouse).
- Experience with CloudSQL, Cloud Functions, Pub/Sub, Cloud Composer, etc.
- Inquisitive, proactive, and interested in learning new tools and techniques.
- Familiarity with big data and machine learning tools and platforms.
- Comfortable with open source technologies including Apache Spark, Hadoop, and Kafka.
- Strong oral, written, and interpersonal communication skills.
- Comfortable working in a dynamic environment where problems are not always well-defined.
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
Posted 1 day ago
5.0 - 8.0 years
20 - 32 Lacs
hyderabad, pune, bengaluru
Work from Office
We are seeking a highly skilled Senior Cloud Native Developer with expertise in Google Cloud Platform (GCP) to join our dynamic team. In this role, the successful candidate will design, develop, and deploy innovative cloud-based solutions, ensuring they meet both functional and non-functional requirements while adhering to security and architectural standards.
Responsibilities:
- Design, develop, and deploy cloud-based solutions using GCP.
- Comply with cloud architecture standards and best practices.
- Engage in hands-on coding of Java applications using GCP-native services such as GKE, Cloud Run, and more.
- Choose appropriate GCP services for project needs.
- Demonstrate in-depth knowledge of GCP PaaS, serverless, and database services.
- Ensure adherence to security and regulatory standards for all cloud solutions.
- Optimize performance, cost, and scalability of cloud-based solutions.
- Keep abreast of the latest cloud technologies and industry trends.
- Enhance cloud solutions by leveraging GCP GenAI offerings such as Vertex AI.
Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of experience in designing, implementing, and maintaining applications on GCP.
- Proficiency in GCP services including GKE, Cloud Run, Cloud Functions, Firestore, and Firebase.
- Solid understanding of cloud security best practices and implementation of security controls within GCP.
- Thorough knowledge of cloud architecture principles and adherence to best practices.
- Experience with Terraform and a strong understanding of DevOps principles.
Nice to have:
- Familiarity with GCP GenAI solutions such as Vertex AI, Codebison, and Gemini models.
- Hands-on experience in front-end technologies such as Angular or React.
8.0 - 12.0 years
30 - 42 Lacs
hyderabad, pune, bengaluru
Work from Office
We are seeking a Technical Lead with strong Application Development expertise in Google Cloud Platform (GCP). The successful candidate will provide technical leadership in designing and implementing robust, scalable cloud-based solutions. If you are an experienced professional passionate about GCP technologies and committed to staying abreast of emerging trends, apply today. Responsibilities Design, develop, and deploy cloud-based solutions using GCP, establishing and adhering to cloud architecture standards and best practices Hands-on coding experience in building Java applications using GCP native services like GKE, Cloud Run, Functions, Firestore, Cloud SQL, Pub/Sub, etc. Develop low-level application architecture designs based on enterprise standards Choose appropriate GCP services meeting functional and non-functional requirements Demonstrate comprehensive knowledge of GCP PaaS, Serverless, and Database services Provide technical leadership to development and infrastructure teams, guiding them throughout the project lifecycle Ensure all cloud-based solutions comply with security and regulatory standards Enhance cloud-based solutions, optimizing performance, cost, and scalability Stay up-to-date with the latest cloud technologies and trends in the industry Familiarity with GCP GenAI solutions and models, including Vertex AI, Codebison, and Gemini models, is preferred but not required Hands-on experience in front-end technologies like Angular or React is an added advantage Requirements Bachelor's or Master's degree in Computer Science, Information Technology, or a similar field Must have 8+ years of extensive experience in designing, implementing, and maintaining applications on GCP Comprehensive expertise in GCP services such as GKE, Cloud Run, Functions, Cloud SQL, Firestore, Firebase, Apigee, App Engine, Gemini Code Assist, Vertex AI, Spanner, Memorystore, Service Mesh, and Cloud Monitoring Solid understanding of cloud security best practices and experience in implementing security controls in GCP Thorough understanding of cloud architecture principles and best practices Experience with automation and configuration management tools like Terraform and a sound understanding of DevOps principles Proven leadership skills and the ability to mentor and guide a technical team
4.0 - 9.0 years
15 - 30 Lacs
hyderabad, pune, chennai
Hybrid
Are you looking for a challenge? Working for Tech Mahindra means being surrounded by people who share your vision. If you're looking for a company that places PEOPLE first, and a place where you can be yourself and create meaningful impact, you'll want to explore the many opportunities we have on our career site. Would you like to make a career with us? Come and join us as you take up a challenging career, which we would gladly manage for you. DETAILS OF THE CURRENT OPPORTUNITY Role: GCP Backend Developer/Senior Developer Location: Chennai, Pune, Hyderabad Experience: 5+ years All backend developers, including the lead developer, must have GCP and BigQuery experience. General: Git, Scrum, Azure DevOps Backend: GCP, BigQuery, Power BI, Microservice Architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, Azure AD, HTTP, README documentation Frontend Web: HTML5, TypeScript, Angular, REST, Azure AD, Material Design Patterns, Reactive Design Patterns ABOUT COMPANY We are Tech Mahindra, a global consulting and systems integration company operating in over 90 countries, delivering solutions with a unique blend of digital innovation and robust, industry-strong processes. With our promise to help our customers Scale at Speed™, we are digital changemakers, here to disrupt old ideas, blaze new trails, and help enterprises transform and scale at unparalleled speed. Services: Consulting, Technology Services, Digital Transformation, Emerging Technologies, Industry Expertise 150,000+ professionals in over 100 countries. Our vision: we will continue to Rise to be an agile, customer-centric, and purpose-led company, delivering best-in-class technology solutions to our stakeholders.
To know more about us, visit our website: https://www.techmahindra.com Please send your updated profile to pk00789764@techmahindra.com with the following details: 1. Total experience 2. Relevant experience 3. Current CTC 4. Expected CTC 5. Notice period 6. Current location Regards, Swapna, 9841034468, pk00789764@techmahindra.com
8.0 - 10.0 years
10 - 15 Lacs
noida
Work from Office
Key Responsibilities: Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions. SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources. ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems. Data Quality Assurance: Implement and monitor data quality checks, identify data discrepancies, anomalies, and inconsistencies, and work with development and business teams to resolve issues. Performance Testing (Data Focus): Contribute to performance testing efforts for data pipelines and database operations, ensuring optimal query and data load performance. Test Data Management: Create and manage robust test data sets for various testing phases, including positive, negative, and edge case scenarios. Defect Management: Identify, document, track, and re-test defects in data, collaborating closely with development and data engineering teams for timely resolution. Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports. Provide regular status updates to stakeholders. Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows, business requirements, and ensure data quality standards are met. Process Improvement: Proactively identify opportunities for process improvements in data testing methodologies and tools. Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams. 
Required Skills & Experience:
Experience: 8-10 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
SQL Expertise:
- Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
- Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
- Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
Domain Expertise:
- Strong understanding and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
- Knowledge of financial products, regulations, and risk management concepts.
Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.
Mandatory Competencies: ETL - Tester; QA/QE - QA Automation - ETL Testing; Database - PostgreSQL; Beh - Communication; Database - SQL Server - SQL Packages
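As a concrete illustration of the SQL-driven reconciliation work described above, here is a minimal sketch using Python's sqlite3 module as a local stand-in for the real source and target databases; the trade tables, column names, and sample values are hypothetical, invented purely for the example:

```python
import sqlite3

# SQLite stands in for the actual source/target systems; table and
# column names below are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE src_trades (trade_id INTEGER PRIMARY KEY, notional REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER PRIMARY KEY, notional REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.0), (3, 75.5);
    INSERT INTO tgt_trades VALUES (1, 100.0), (2, 999.0);
    """
)

def reconcile(conn):
    """Return (missing_ids, mismatched_ids) between source and target."""
    # Completeness check: source rows that never arrived in the target.
    missing = [r[0] for r in conn.execute(
        "SELECT s.trade_id FROM src_trades s "
        "LEFT JOIN tgt_trades t ON s.trade_id = t.trade_id "
        "WHERE t.trade_id IS NULL")]
    # Accuracy check: rows present in both but with differing values.
    mismatched = [r[0] for r in conn.execute(
        "SELECT s.trade_id FROM src_trades s "
        "JOIN tgt_trades t ON s.trade_id = t.trade_id "
        "WHERE s.notional != t.notional")]
    return missing, mismatched

missing, mismatched = reconcile(conn)
print(missing, mismatched)  # rows to raise as defects
```

The same LEFT JOIN completeness and value-comparison patterns carry over directly to the production databases named in the listing.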
7.0 - 12.0 years
7 - 17 Lacs
pune, chennai, bengaluru
Work from Office
• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP • Building data pipelines for huge volumes of data • Dataflow, Dataproc, and BigQuery • Deep understanding of ETL concepts
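To make the ETL concepts above concrete, here is a toy extract-transform-load pass in plain Python. In the stack this listing describes it would be PySpark jobs writing to BigQuery; here plain data structures stand in, and every function name and sample row is invented for illustration:

```python
# Toy ETL sketch: extract, cleanse/transform, load. Stand-in for the
# PySpark/BigQuery pipelines named in the listing; all names invented.
def extract():
    # Pretend source rows (in real life, read from GCS, an API, etc.).
    return [
        {"id": 1, "amount": "100"},
        {"id": 2, "amount": "250"},
        {"id": 2, "amount": "250"},   # duplicate to be dropped
        {"id": 3, "amount": None},    # bad record to be filtered
    ]

def transform(rows):
    # Cleanse: drop nulls and duplicate ids, cast amount to int.
    seen, out = set(), []
    for row in rows:
        if row["amount"] is None or row["id"] in seen:
            continue
        seen.add(row["id"])
        out.append({"id": row["id"], "amount": int(row["amount"])})
    return out

def load(rows, target):
    # Load into the target "table" (a list standing in for BigQuery).
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse)
```

The shape is the same at scale: PySpark's `filter`/`dropDuplicates`/`cast` replace the hand-rolled transform, and a BigQuery write replaces the list append.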
Posted Just now
8.0 - 12.0 years
25 - 37 Lacs
pune, bengaluru
Work from Office
We're looking for an experienced GCP Technical Lead to architect, design, and lead the development of scalable cloud-based solutions. The ideal candidate should have strong expertise in Google Cloud Platform (GCP), data engineering, and modern cloud-native architectures, along with the ability to mentor a team of engineers. Key Responsibilities Lead the design and development of GCP-based solutions (BigQuery, Dataflow, Composer, Pub/Sub, GKE, etc.) Define cloud architecture best practices and ensure adherence to security, scalability, and performance standards. Collaborate with stakeholders to understand requirements and translate them into technical designs and roadmaps. Lead and mentor a team of cloud/data engineers, providing guidance on technical challenges. Implement and optimize ETL/ELT pipelines, data lake, and data warehouse solutions on GCP. Drive DevOps/CI-CD practices using Cloud Build, Terraform, or similar tools. Ensure cost optimization, monitoring, and governance within GCP environments. Work with cross-functional teams on cloud migrations and modernization projects. Required Skills & Qualifications Strong experience in GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, GKE, etc. Expertise in data engineering, ETL, and cloud-native development. Hands-on experience with Python, SQL, and Shell scripting. Knowledge of Terraform, Kubernetes, and CI/CD pipelines. Familiarity with data security, IAM, and compliance on GCP. Proven experience in leading technical teams and delivering large-scale cloud solutions. Excellent problem-solving, communication, and leadership skills. Preferred GCP Professional Cloud Architect / Data Engineer certification. Experience with machine learning pipelines (Vertex AI, AI Platform)
Posted Just now
5.0 - 10.0 years
25 - 35 Lacs
gurugram
Hybrid
We're Hiring: Senior Data Engineer (GCP Migration) | Gurgaon (Hybrid, 3 days office per week) Location: Gurgaon (Hybrid, 3 days in office per week) Start Date: End of September Experience Level: 5+ years relevant Data Engineering experience About the Role: We are looking for a highly skilled Senior Data Engineer to lead and support our strategic migration of data platforms to Google Cloud Platform (GCP). This is a great opportunity for an experienced professional who thrives on solving complex problems and building scalable data solutions. You will play a critical role in designing, developing, and optimizing data pipelines while ensuring smooth migration from legacy systems. Key Responsibilities: Lead and execute migration of legacy data platforms to GCP. Design and build scalable data pipelines using BigQuery, Dataflow, and dbt. Orchestrate workflows with Cloud Composer (Airflow on GCP). Collaborate with cross-functional teams to align data requirements and migration goals. Optimize data models and queries for performance and cost-efficiency. Ensure governance, security, and reliability across data platforms. Monitor and troubleshoot data pipelines in production. Required Qualifications: 5+ years of Data Engineering experience (minimum 2 years on GCP). Strong hands-on expertise with BigQuery and Dataflow. Proficiency with dbt for data transformation and modeling. Experience with Cloud Composer / Apache Airflow. Proven track record of end-to-end migration projects. Strong SQL and Python programming skills. Familiarity with CI/CD practices and Git. Excellent problem-solving and communication skills. Preferred Qualifications: Strong understanding of data warehousing concepts and dimensional modeling. Exposure to Terraform or IaC tools for GCP provisioning. Why Join Us? Be part of a strategic cloud transformation initiative. Work with cutting-edge technologies in a collaborative, innovative environment.
Opportunity to influence architecture and best practices for a modern data platform. How to Apply: If you're ready to take on this exciting challenge, share your updated profile at Vijay.S@xebia.com with the following details: Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period (Immediate to 2 weeks; only apply if you can join early), Current Location, Preferred Location, LinkedIn Profile URL
Posted Just now
10.0 - 14.0 years
11 - 15 Lacs
chennai, bengaluru
Work from Office
An experienced consulting professional who understands solutions, industry best practices, multiple business processes, or technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects. 10-12 years of experience relevant to this position. Ability to communicate effectively. Ability to build rapport with team members and clients. Ability to travel as needed. Responsibilities The candidate is expected to have 10 to 12 years of expert domain knowledge in HCM covering the hire-to-retire cycle. S/he must have been part of at least 5 end-to-end HCM implementations, of which at least 2 should have been with HCM Cloud. The candidate must have expert working experience in 1 or more of these modules along with the Payroll module: Time and Labor, Absence Management, Talent, Benefits, Compensation, Recruiting (ORC), Core HR. In-depth understanding of HCM Cloud business processes and their data flow. The candidate should have been in client-facing roles and interacted with customers in requirement gathering workshops, design, configuration, testing, and go-live.
Should have strong written and verbal communication skills, personal drive, flexibility, teamwork, problem-solving, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement, knowledge sharing, and client management. Good leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring, and status updates to the Project Manager. Assist in the identification, assessment, and resolution of complex functional issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic or international travel for short as well as long durations.
Posted Just now
BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!
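For that interview practice, here is one small example of the kind of window-function query BigQuery interviews commonly cover, run locally against SQLite, which shares the standard RANK() OVER (PARTITION BY ...) syntax with BigQuery; the sales table and its rows are invented for the exercise:

```python
import sqlite3

# SQLite (3.25+) as a free local stand-in for practicing BigQuery-style
# window functions; the table and data below are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100), ("north", 250), ("south", 80), ("south", 120)],
)

# Rank each sale within its region by amount, descending. The same
# RANK() OVER (PARTITION BY ... ORDER BY ...) clause works in BigQuery.
rows = conn.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()

for region, amount, rnk in rows:
    print(region, amount, rnk)
```

Being able to explain why PARTITION BY resets the ranking per region, and how RANK differs from DENSE_RANK and ROW_NUMBER on ties, is a frequent interview follow-up.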