7.0 - 9.0 years
8 - 15 Lacs
Hyderabad
Hybrid
Role & Responsibilities

Role Overview: We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad, with responsibilities that include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
• Proficiency in ETL, batch, and streaming processing
• Experience with BigQuery, Cloud Storage, and CloudSQL
• Strong programming skills in Python, SQL, and Apache Beam for data processing
• Understanding of data modeling and schema design for analytics
• Knowledge of data governance, security, and compliance in GCP
• Familiarity with machine learning workflows and integration with GCP ML tools
• Ability to optimize performance within data pipelines

Functional Requirements:
• Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop Data Product Features
• Experience in leading and mentoring peers within an existing development team
• Strong communication skills to craft and communicate robust solutions
• Proficient in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
• Willingness to work on contemporary data architecture in Public and Private Cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification
o Engineering Grad / Postgraduate

CRITERIA
o Proficient in ETL, Python, and Apache Beam for data processing efficiency.
o Demonstrated expertise in BigQuery, Cloud Storage, and CloudSQL utilization.
o Strong collaboration skills with cross-functional teams for data product development.
o Comprehensive knowledge of data governance, security, and compliance in GCP.
o Experienced in optimizing performance within data pipelines for efficiency.
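For context on the Beam/BigQuery stack this role names, below is a minimal sketch of a batch pipeline in Python that reads CSV extracts from Cloud Storage and appends them to a BigQuery table. The project, bucket, table, and column names are placeholders, not details from the posting:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    # Assumed CSV layout: id,amount,ts (placeholder schema)
    fields = line.split(",")
    return {"id": fields[0], "amount": float(fields[1]), "ts": fields[2]}

options = PipelineOptions(
    runner="DataflowRunner",           # use DirectRunner for local testing
    project="example-project",         # placeholder project id
    region="asia-south1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromGCS" >> beam.io.ReadFromText(
            "gs://example-bucket/input/*.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_row)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:analytics.transactions",
            schema="id:STRING,amount:FLOAT,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

The same pipeline code runs locally on the DirectRunner or at scale on Dataflow, which is what makes Beam a fit for the batch-and-streaming requirement here.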
Posted 1 week ago
6.0 - 9.0 years
7 - 14 Lacs
Hyderabad
Work from Office
Role Overview: We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad, with responsibilities that include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
1. Proficiency in ETL, batch, and streaming processing
2. Experience with BigQuery, Cloud Storage, and CloudSQL
3. Strong programming skills in Python, SQL, and Apache Beam for data processing
4. Understanding of data modeling and schema design for analytics
5. Knowledge of data governance, security, and compliance in GCP
6. Familiarity with machine learning workflows and integration with GCP ML tools
7. Ability to optimize performance within data pipelines

Functional Requirements:
1. Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop Data Product Features
2. Experience in leading and mentoring peers within an existing development team
3. Strong communication skills to craft and communicate robust solutions
4. Proficient in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
5. Willingness to work on contemporary data architecture in Public and Private Cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification
Engineering Grad / Postgraduate

CRITERIA
1. Proficient in ETL, Python, and Apache Beam for data processing efficiency.
2. Demonstrated expertise in BigQuery, Cloud Storage, and CloudSQL utilization.
3. Strong collaboration skills with cross-functional teams for data product development.
4. Comprehensive knowledge of data governance, security, and compliance in GCP.
5. Experienced in optimizing performance within data pipelines for efficiency.
6. Relevant Experience: 6-9 years

Connect at 9993809253
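The streaming side of the batch-and-streaming requirement can be sketched in the same Beam/Python stack: the example below reads from a Pub/Sub topic, counts events over one-minute fixed windows, and publishes the counts back to Pub/Sub. The topic names and windowing choice are illustrative assumptions:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

# streaming=True enables the unbounded Pub/Sub source;
# add project/region/runner options as needed.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")    # placeholder topic
        | "Decode" >> beam.Map(lambda b: b.decode("utf-8"))
        | "Window1Min" >> beam.WindowInto(FixedWindows(60))    # 60-second windows
        | "PairWithOne" >> beam.Map(lambda event_type: (event_type, 1))
        | "CountPerType" >> beam.CombinePerKey(sum)
        | "Encode" >> beam.Map(lambda kv: f"{kv[0]},{kv[1]}".encode("utf-8"))
        | "PublishCounts" >> beam.io.WriteToPubSub(
            topic="projects/example-project/topics/event-counts")
    )
```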
Posted 1 week ago
6.0 - 10.0 years
12 - 18 Lacs
Hyderabad
Hybrid
Role & Responsibilities

Role Overview: We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad, with responsibilities that include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
• Proficiency in ETL, batch, and streaming processing
• Experience with BigQuery, Cloud Storage, and CloudSQL
• Strong programming skills in Python, SQL, and Apache Beam for data processing
• Understanding of data modeling and schema design for analytics
• Knowledge of data governance, security, and compliance in GCP
• Familiarity with machine learning workflows and integration with GCP ML tools
• Ability to optimize performance within data pipelines

Functional Requirements:
• Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop Data Product Features
• Experience in leading and mentoring peers within an existing development team
• Strong communication skills to craft and communicate robust solutions
• Proficient in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
• Willingness to work on contemporary data architecture in Public and Private Cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification
Engineering Grad / Postgraduate

CRITERIA
• Proficient in ETL, Python, and Apache Beam for data processing efficiency.
• Demonstrated expertise in BigQuery, Cloud Storage, and CloudSQL utilization.
• Strong collaboration skills with cross-functional teams for data product development.
• Comprehensive knowledge of data governance, security, and compliance in GCP.
• Experienced in optimizing performance within data pipelines for efficiency.

Relevant Experience: 6-9 years
Posted 1 week ago
8.0 - 10.0 years
40 - 45 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
Roles & Responsibilities:

Data Engineering Leadership & Strategy: Lead and mentor a team of data engineers, fostering a culture of technical excellence and collaboration. Define and implement data engineering best practices, standards, and processes.

Data Pipeline Architecture & Development: Design, build, and maintain scalable, robust, and efficient data pipelines for ingestion, transformation, and loading of data from various sources. Optimize data pipelines for performance, reliability, and cost-effectiveness. Implement data quality checks and monitoring systems to ensure data integrity (see the sketch after this posting). Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.

Cloud-Based Data Infrastructure: Design, implement, and manage cloud-based data infrastructure using platforms like AWS, Azure, or GCP. Leverage cloud services (e.g., data lakes, data warehouses, serverless computing) to build scalable and cost-effective data solutions. Leverage open-source tools such as Airbyte, Mage AI, and similar. Ensure data security, governance, and compliance within the cloud environment.

Data Modeling & Warehousing: Design and implement data models to support business intelligence, reporting, and analytics. Optimize data warehouse performance for efficient querying and reporting.

Collaboration & Communication: Collaborate effectively with cross-functional teams including product managers, software engineers, and business stakeholders.

Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
8+ years of proven experience in data engineering, with at least 3+ years in a lead role.
Expertise in building and maintaining data pipelines using tools such as Apache Spark, Apache Kafka, Apache Beam, or similar.
Proficiency in SQL and one or more programming languages like Python, Java, or Scala.
Hands-on experience with cloud-based data platforms (AWS, Azure, GCP) and services.

Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Work Timings: 2.30 pm - 11.30 pm IST
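One common way to implement the data quality checks mentioned above is Beam's tagged outputs: valid records continue down the main pipeline while failures are routed to a dead-letter sink for inspection. A sketch with an assumed record shape and placeholder paths:

```python
import json

import apache_beam as beam
from apache_beam import pvalue

class ValidateRecord(beam.DoFn):
    DEAD_LETTER = "dead_letter"

    def process(self, raw):
        try:
            record = json.loads(raw)
            # Assumed quality rules for illustration.
            assert "customer_id" in record and record["amount"] >= 0
            yield record                                        # main output: valid
        except Exception:
            yield pvalue.TaggedOutput(self.DEAD_LETTER, raw)    # quarantined

with beam.Pipeline() as p:
    results = (
        p
        | beam.io.ReadFromText("gs://example-bucket/raw/*.json")   # placeholder path
        | beam.ParDo(ValidateRecord()).with_outputs(
            ValidateRecord.DEAD_LETTER, main="valid")
    )
    results.valid | "StoreValid" >> beam.Map(print)   # stand-in for the real sink
    results.dead_letter | beam.io.WriteToText(
        "gs://example-bucket/dead_letter/records")    # failed rows for inspection
```

Monitoring the dead-letter volume then gives a simple, queryable data-integrity signal.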
Posted 2 weeks ago
6 - 8 years
11 - 16 Lacs
Pune, Hyderabad
Work from Office
Job Description: We are looking for a Senior Data Engineer with strong experience in data pipelines, ETL processes, and data visualization. The ideal candidate will have hands-on expertise with Apache Superset, Apache Beam, Apache Flink, and Apache Airflow.

Key Responsibilities:
Design, develop, and maintain end-to-end data pipelines using Apache Beam and Apache Flink for both real-time and batch data processing.
Build, maintain, and optimize data orchestration workflows using Apache Airflow DAGs for task automation and scheduling (see the sketch after this posting).
Leverage Apache Superset to create interactive data visualizations, dashboards, and reports for business intelligence and data exploration.
Collaborate with cross-functional teams to gather requirements and translate them into technical solutions for data engineering projects.
Ensure data quality, consistency, and security across all data platforms.
Implement and optimize complex SQL queries on relational databases and manage data storage using SQL, MongoDB, or PostgreSQL.
Troubleshoot, debug, and optimize data processing workflows to ensure high performance and scalability.
Continuously improve and innovate data engineering processes and technologies to enhance the performance and usability of data systems.

Preferred Candidate Profile:
Key Skills:
6+ years of experience with Apache Superset, Apache Beam, Apache Flink, and Apache Airflow.
Proficiency in SQL, MongoDB, or PostgreSQL.
Strong problem-solving skills and experience with data engineering best practices.
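As a sketch of the Airflow orchestration described above (assuming Airflow 2.x), a small DAG that runs a batch processing job and then refreshes a dashboard-facing table; the DAG id, script paths, and BashOperator choice are assumptions for illustration:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_pipeline",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["data-engineering"],
) as dag:
    run_batch_job = BashOperator(
        task_id="run_batch_job",
        # Assumed pipeline entry point; {{ ds }} is Airflow's execution date.
        bash_command="python /opt/pipelines/events_batch.py --date {{ ds }}",
    )
    refresh_reporting_table = BashOperator(
        task_id="refresh_reporting_table",
        bash_command="python /opt/pipelines/refresh_reporting.py --date {{ ds }}",
    )
    run_batch_job >> refresh_reporting_table   # scheduling dependency
```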
Posted 2 months ago
14 - 20 years
25 - 40 Lacs
Pune
Hybrid
Skills: Data Engineering. Must have experience with Java- and Scala-based implementations for enterprise-wide platforms. Experience with Apache Beam, Google Dataflow, and Apache Kafka for the real-time stream processing technology stack. Complex stateful processing of events with partitioning for higher throughput.

Role & Responsibilities
Must Have - Any of the below combinations:
1. Java + Apache Beam (Dataflow / Flink) & GCP
2. Python + Apache Beam (Dataflow / Flink) & GCP
3. Java + Apache Spark
4. Python + Apache Spark
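The stateful, partitioned event processing called out here can be sketched with Beam's per-key state API (shown in Python, matching combination 2). Beam maintains state independently per key, which is what allows partitioned, high-throughput processing; the account keys and amounts below are toy data:

```python
import apache_beam as beam
from apache_beam.transforms.userstate import CombiningValueStateSpec

class RunningTotalPerKey(beam.DoFn):
    # Per-key mutable state; the runner partitions state by the element's key.
    TOTAL = CombiningValueStateSpec("total", sum)

    def process(self, element, total=beam.DoFn.StateParam(TOTAL)):
        key, amount = element
        total.add(amount)
        yield key, total.read()   # emit the running total for this key

with beam.Pipeline() as p:
    (
        p
        # Stateful DoFns require a keyed PCollection of (key, value) pairs.
        | beam.Create([("acct-1", 10), ("acct-2", 5), ("acct-1", 7)])  # toy input
        | beam.ParDo(RunningTotalPerKey())
        | beam.Map(print)
    )
```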
Posted 2 months ago
5 - 8 years
6 - 10 Lacs
Chennai, Hyderabad
Work from Office
Notice Period: Immediate joiners only (1 week or less)

We are seeking a Senior GCP Data Engineer with strong expertise in Teradata (mandatory) and Google Cloud Platform (GCP) to design, develop, and optimize data pipelines and cloud-based data solutions.

Key Responsibilities:
Design and implement scalable data pipelines using GCP services across multiple projects.
Develop and deploy ETL/ELT processes using Python, Apache Beam, and SQL scripts.
Implement CI/CD pipelines with Jenkins or cloud-native tools to automate deployment, testing, and integration of data applications.
Lead a team of data engineers, ensuring best practices in data engineering and performance optimization.
Optimize workflows for cost efficiency, performance, and scalability on GCP.
Design and manage workflows to integrate data from diverse sources using GCP services and Cloud Composer (Apache Airflow).
Set up monitoring, logging, and alerting using tools like Cloud Monitoring and Datadog to track pipeline performance and quickly resolve issues.
Work on data migration projects from legacy databases such as Oracle, Teradata, and SQL Server to GCP cloud data platforms.
Collaborate with application developers, data architects, and business stakeholders to develop robust data-driven solutions.
Facilitate agile processes including sprint planning, daily stand-ups, and backlog grooming.
Engage with client stakeholders on data, BI, and analytics programs, ensuring seamless communication and governance.

Required Skills & Qualifications:
Mandatory expertise in Teradata and GCP (Google Cloud Platform).
Strong experience in data engineering, data pipelines, ETL/ELT, and cloud-native architectures.
Proficiency in Python, SQL, Apache Beam, and cloud-based orchestration tools.
Hands-on experience with CI/CD, automation, and monitoring tools.
Strong problem-solving skills, with the ability to optimize and troubleshoot complex data workflows.
Experience in stakeholder management and the ability to communicate technical solutions effectively.
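For the migration responsibilities above, legacy extracts (e.g., from Teradata or Oracle) are commonly staged in Cloud Storage and then loaded into BigQuery. A minimal sketch using the google-cloud-bigquery client; the project, bucket, and table names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")   # placeholder project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                                   # infer schema from the extract
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/teradata_extracts/customers_*.csv",  # staged extract files
    "example-project.staging.customers",                      # landing table
    job_config=job_config,
)
load_job.result()   # block until the load completes, raising on error
print(f"Loaded {load_job.output_rows} rows.")
```

In practice the schema would be declared explicitly rather than autodetected, so that type mismatches against the Teradata source surface at load time.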
Posted 2 months ago
6 - 8 years
14 - 18 Lacs
Bengaluru, Kolkata
Work from Office
Must have exposure to GCP Machine Learning and Vertex AI.

Collaborate with data engineers and software developers to facilitate the deployment, monitoring, and maintenance of machine learning models.
Experience in training and deploying AutoML and custom tabular/image models using Vertex AI.
Hands-on experience in designing, implementing, and maintaining robust data pipelines, data transformation processes, and data storage solutions on GCP, utilizing Apache Beam, Dataflow, and BigQuery.
Demonstrated expertise in training, evaluating, and tuning ML models in BigQuery.
Hands-on experience with Docker, Kubernetes, Kubeflow, and Cloud Build.
Successfully created and managed CI/CD pipelines for machine learning model deployment, ensuring automation and reproducibility with Cloud Build on GCP.
Familiar with security best practices including IAM, network security, and data encryption in GCP.
Experienced in troubleshooting issues across the end-to-end ML lifecycle, from data preprocessing to model serving.
Adept with Vertex AI's Generative AI Studio.
Hands-on experience using the Vertex AI PaLM API, including text-bison, chat-bison, and textembedding-gecko.
Knowledgeable in prompt engineering design, techniques, and best practices.
Skilled in creating prompts for ideation, text classification, text extraction, and text summarization.
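A hedged sketch of the AutoML tabular train-and-deploy workflow this posting mentions, using the Vertex AI Python SDK; the dataset path, display names, target column, machine type, and prediction instance are placeholder assumptions:

```python
from google.cloud import aiplatform

aiplatform.init(project="example-project", location="us-central1")  # placeholders

# Create a tabular dataset from a CSV staged in Cloud Storage (assumed path).
dataset = aiplatform.TabularDataset.create(
    display_name="churn-dataset",
    gcs_source="gs://example-bucket/training/churn.csv",
)

# Train an AutoML tabular classification model.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)
model = job.run(
    dataset=dataset,
    target_column="churned",          # assumed label column
    budget_milli_node_hours=1000,     # 1 node-hour training budget
)

# Deploy to an endpoint for online prediction.
endpoint = model.deploy(machine_type="n1-standard-4")
prediction = endpoint.predict(instances=[{"tenure": "12", "plan": "basic"}])
print(prediction.predictions)
```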
Posted 2 months ago
7 - 10 years
25 - 35 Lacs
Indore, Bengaluru, Hyderabad
Work from Office
Job Description: GCP Lead Developer
Location: Hyderabad, Bangalore, and Indore
Notice period: 15-30 days max

What we are looking for:
7+ years of experience in data analytics, with at least 4+ years in a tech-lead role.
Expertise in building solution architecture, provisioning infrastructure, and delivering secure, reliable, data-centric services and applications in GCP.
Minimum 3-5 years of designing, building, and operationalizing enterprise data solutions and applications using one or more GCP data and analytics services in combination with 3rd parties - Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, BigTable, Cloud BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
Minimum 2 years of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
Ability to formulate business problems as technical data problems while ensuring key business drivers are captured in collaboration with business users.
Work with the data team to efficiently use Cloud infrastructure to analyze data, build models, and generate reports/visualizations.
Extracting, loading, transforming, cleaning, and validating data; integrating datasets from multiple data sources for data modelling (see the sketch after this posting).
Excellent programming skills with SQL and Python.
Strong communication and analytical skills.
GCP Professional Data Engineer certification preferred.
Able to drive workshops with business users to gather requirements and translate them into analytics solutions.
Drive and contribute to presales activities including proposals, POCs, and demos.

Working at TekLink
At TekLink our employees get an open and collaborative environment and gain experience working for customers in various industries while solving complex business problems. They get support to learn new technologies as well as to enhance existing skills to further their career growth. We offer:
Competitive remuneration
Excellent benefits including health, accidental, and life coverage
Excellent performance-based annual increments
Fast-paced growth opportunities
International work experience
Opportunity to participate in various sports and CSR activities
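As an example of the extract-load-transform and dataset-integration work listed above, an in-warehouse BigQuery transformation run from Python; the dataset and table names are illustrative:

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")   # placeholder project

# ELT: integrate two staged sources inside BigQuery rather than in the pipeline.
sql = """
CREATE OR REPLACE TABLE analytics.customer_orders AS
SELECT
  c.customer_id,
  c.region,
  COUNT(o.order_id) AS order_count,
  SUM(o.amount)     AS total_amount
FROM staging.customers AS c
LEFT JOIN staging.orders AS o
  ON o.customer_id = c.customer_id
GROUP BY c.customer_id, c.region
"""
client.query(sql).result()   # wait for the transformation job to finish
```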
Posted 3 months ago
10 - 18 years
30 - 45 Lacs
Chennai
Work from Office
Details on tech stack:
GCP Services: BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, Cloud Storage.
Data Processing: Apache Beam (batch/stream), Apache Kafka, Cloud Dataprep.
Programming: Python, Java/Scala, SQL.
Orchestration: Apache Airflow (Cloud Composer), Terraform.
Security: IAM, Cloud Identity, Cloud Security Command Center.
Containerization: Docker, Kubernetes (GKE).
Machine Learning: Google AI Platform, TensorFlow, AutoML.
Certifications: Google Cloud Data Engineer, Cloud Architect (preferred).

Proven ability to design scalable and robust AI/ML systems in production, with a focus on high-performance and cost-effective solutions.
Strong experience with cloud platforms (Google Cloud, AWS, Azure) and cloud-native AI/ML services (e.g., Vertex AI, SageMaker).
Expertise in implementing MLOps practices, including model deployment, monitoring, retraining, and version control.
Strong leadership skills with the ability to guide teams, mentor engineers, and collaborate with cross-functional teams to meet business objectives.
Deep understanding of frameworks like TensorFlow, PyTorch, and Scikit-learn for designing, training, and deploying models.
Experience with data engineering principles, scalable pipelines, and distributed systems (e.g., Apache Kafka, Spark, Kubernetes).

Nice to have:
Strong leadership and mentorship capabilities, guiding teams toward best practices and high-quality deliverables.
Excellent problem-solving skills, with a focus on designing efficient, high-performance systems.
Effective project management abilities to handle multiple initiatives and ensure timely delivery.
Strong emphasis on collaboration and teamwork, fostering a positive and productive work environment.
Posted 3 months ago
7 - 12 years
37 - 45 Lacs
Pune
Work from Office
Responsible for providing fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence. Loading data from various systems of record into our platform and making it available for further use. Automating deployment and test processes to deliver fast incremental improvements of our application and platform. Implementing data governance and protection to adhere to regulatory requirements and policies. Transforming and combining data into a data model which supports our data analysts or can easily be consumed by operational databases. Keeping hygiene, Risk and Control, and Stability at the core of every delivery. Being a role model for the team. Working in an agile setup and giving feedback to improve our way of working.

Commercial Banking Tribe
You'll be joining the Commercial Bank Tribe, which focuses on the special needs of small and medium enterprise clients in Germany, a designated area for further growth and investment within Corporate Bank. We are responsible for the digital transformation of ~800,000 clients across 3 brands, i.e. the establishment of the BizBanking platform, including development of digital sales and service processes as well as the automation of processes for this client segment. Our tribe is on a journey of extensive digitalisation of business processes and migration of our applications to the cloud. We work jointly with our business colleagues in an agile setup, collaborating closely with stakeholders and engineers from other areas, striving to achieve a highly automated and adaptable process and application landscape.

Your key responsibilities
Design, develop, and deploy data processing pipelines and data-driven applications on GCP.
Write and maintain SQL queries and use data modeling tools like Dataform or dbt for data management.
Write clean, maintainable code in Java and/or Python, adhering to clean code principles.
Apply concepts of deployments and configurations in GKE/OpenShift, and implement infrastructure as code using Terraform.
Set up and maintain CI/CD pipelines using GitHub Actions; write and maintain unit and integration tests (see the sketch after this posting).

Your skills and experience
Bachelor's degree in Computer Science, Data Science, or a related field, or equivalent work experience.
Proven experience as a Data Engineer, Backend Engineer, or in a similar role.
Strong experience with Cloud, Terraform, and GitHub Actions.
Proficiency in SQL and Java and/or Python; experience with tools and frameworks like Apache Beam, Spring Boot, and Apache Airflow.
Familiarity with data modeling tools like Dataform or dbt, and experience writing unit and integration tests.
Understanding of clean code principles and commitment to writing maintainable code.
Excellent problem-solving skills, attention to detail, and strong communication skills.
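Given the emphasis on unit tests around pipeline code, a minimal sketch of testing a Beam transform with Beam's testing utilities; the transform itself is a stand-in example, not anything from the posting:

```python
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

def normalise_iban(raw: str) -> str:
    """Stand-in transform: strip spaces and upper-case an IBAN."""
    return raw.replace(" ", "").upper()

def test_normalise_iban_transform():
    # TestPipeline runs the transform locally and verifies its output.
    with TestPipeline() as p:
        output = (
            p
            | beam.Create(["de12 3456", " de98 7654 "])
            | beam.Map(normalise_iban)
        )
        assert_that(output, equal_to(["DE123456", "DE987654"]))
```

A test like this runs under pytest in a CI pipeline (e.g., GitHub Actions) without needing any cloud resources.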
Posted 3 months ago
6 - 8 years
8 - 12 Lacs
Pune
Work from Office
Responsible for providing fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence. Loading data from various systems of record into our platform and making it available for further use. Automating deployment and test processes to deliver fast incremental improvements of our application and platform. Transforming and combining data into a data model which supports our data analysts or can easily be consumed by operational databases. Creating the best code to fulfil the requirements of our business unit and supporting our customers with the best possible products. Keeping hygiene, Risk and Control, and Stability at the core of every delivery. Working in an agile setup and giving feedback to improve our way of working.

Commercial Banking Tribe
You'll be joining the Commercial Bank Tribe, which focuses on the special needs of small and medium enterprise clients in Germany, a designated area for further growth and investment within Corporate Bank. We are responsible for the digital transformation of ~800,000 clients across 3 brands, i.e. the establishment of the BizBanking platform, including development of digital sales and service processes as well as the automation of processes for this client segment. Our tribe is on a journey of extensive digitalisation of business processes and migration of our applications to the cloud. We work jointly with our business colleagues in an agile setup, collaborating closely with stakeholders and engineers from other areas, striving to achieve a highly automated and adaptable process and application landscape.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

Your key responsibilities
Design, develop, and deploy data processing pipelines and data-driven applications on GCP.
Write and maintain SQL queries and use data modeling tools like Dataform or dbt for data management.
Write clean, maintainable code in Java and/or Python, adhering to clean code principles.
Apply concepts of deployments and configurations in GKE/OpenShift, and implement infrastructure as code using Terraform.
Set up and maintain CI/CD pipelines using GitHub Actions; write and maintain unit and integration tests.

Your skills and experience
Bachelor's degree in Computer Science, Data Science, or a related field, or equivalent work experience.
Strong experience with Cloud, Terraform, and GitHub Actions.
Proficiency in SQL and Java and/or Python; experience with tools and frameworks like Apache Beam, Spring Boot, and Apache Airflow.
Familiarity with data modeling tools like dbt or Dataform, and experience writing unit and integration tests.
Understanding of clean code principles and commitment to writing maintainable code.
Excellent problem-solving skills, attention to detail, and strong communication skills.
Posted 3 months ago
3 - 6 years
5 - 8 Lacs
Pune
Work from Office
An Engineer is responsible for designing, developing, and delivering significant components of engineering solutions to accomplish business goals. Key responsibilities of this role include active participation in the design and development of new features of application enhancements, investigating re-use, and ensuring that solutions are fit for purpose and maintainable and can be integrated successfully into the overall solution and environment with clear, robust, and well-tested deployments. Assists more junior members of the team and controls their work where applicable.

Your key responsibilities
Develops and deploys source code, including infrastructure and application related configurations, for all software components in accordance with the Detailed Software Requirements specification.
Provides development activities for projects and technical infrastructure components (i.e., Java+Python, Apache Beam/Spark configurations and security, data workflows on-premise/cloud, Infrastructure as Code) and source code development.
Debugs, fixes, and provides support to the L3 team.
Verifies the developed source code by reviews (4-eyes principle).
Contributes to quality assurance by writing and conducting unit testing.
Ensures architectural changes (as defined by Architects) are implemented.
Contributes to problem and root cause analysis.
Integrates software components following the integration strategy.
Verifies integrated software components by unit and integrated software testing according to the software test plan. Software test findings must be resolved.
Works on technical continuous improvements across the modules regularly.
Where applicable, develops routines to deploy CIs to the target environments.
Provides release deployments on non-Production Management controlled environments.
Supports creation of Software Product Training Materials, Software Product User Guides, and Software Product Deployment Instructions. Checks consistency of documents with the respective Software Product Release.
Where applicable, manages maintenance of applications and performs technical change requests scheduled according to Release Management processes.
Fixes software defects/bugs, measures, and analyses code for quality.
Collaborates with colleagues participating in other stages of the Software Development Lifecycle (SDLC).
Identifies dependencies between software product components, between technical components, and between applications and interfaces.
Identifies product integration verifications to be performed based on the integration sequence and relevant dependencies.

Your skills and experience

General Skills
Bachelor of Science degree from an accredited college or university with a concentration in Computer Science or Software Engineering (or equivalent), with a minor in Finance, Mathematics, or Engineering.
Strong analytical skills.
Proficient communication skills; fluent in English (written/verbal).
Ability to work in virtual teams and in matrixed organizations.
Excellent team player; open-minded.
Keeps pace with technical innovation; understands the relevant business area.
Ability to share information and transfer knowledge and expertise to team members.
Ability to design and write code in accordance with provided business requirements.
Ability to contribute to QA strategy and architecture decisions.
Knowledge of IT delivery and architecture, including knowledge of data modelling and data analysis.
Relevant Financial Services experience.
Ability to work in a fast-paced environment with competing and alternating priorities with a constant focus on delivery.
Ability to balance business demands and IT fulfilment in terms of standardization, reducing risk, and increasing IT flexibility.

Domain-Specific Skills
Desirable are one or more of the following subject areas. Very good knowledge of the following technologies is needed:
Java / Scala
Apache Spark / Apache Beam
GCP (data engineering services)
Workflow orchestrators like Airflow
Automation through Python / Terraform
Good to have: AI/ML, data analysis, Python development.
Very good knowledge of core processes and tools such as HP ALM, Jira, ServiceNow, SDLC, and Agile processes.
Posted 3 months ago
4 - 8 years
6 - 10 Lacs
Pune
Work from Office
Role Description
An Engineer is responsible for designing, developing, and delivering significant components of engineering solutions to accomplish business goals. Key responsibilities of this role include active participation in the design and development of new features of application enhancements, investigating re-use, and ensuring that solutions are fit for purpose and maintainable and can be integrated successfully into the overall solution and environment with clear, robust, and well-tested deployments. Assists more junior members of the team and controls their work where applicable.

Your key responsibilities
Develops and deploys source code, including infrastructure and application related configurations, for all software components in accordance with the Detailed Software Requirements specification.
Provides development leading activities for projects and technical infrastructure components (i.e., Java+Python, Apache Beam/Spark configurations and security, data workflows on-premise/cloud, Infrastructure as Code) and source code development.
Debugs, fixes, and provides support to the L3 team.
Verifies the developed source code by reviews (4-eyes principle).
Contributes to quality assurance by writing and conducting unit testing.
Ensures architectural changes (as defined by Architects) are implemented.
Drives low-level designs.
Contributes to problem and root cause analysis.
Integrates software components following the integration strategy.
Verifies integrated software components by unit and integrated software testing according to the software test plan. Software test findings must be resolved.
Drives technical continuous improvements across the modules regularly.
Where applicable, develops routines to deploy CIs to the target environments.
Provides release deployments on non-Production Management controlled environments.
Supports creation of Software Product Training Materials, Software Product User Guides, and Software Product Deployment Instructions. Checks consistency of documents with the respective Software Product Release.
Where applicable, manages maintenance of applications and performs technical change requests scheduled according to Release Management processes.
Fixes software defects/bugs, measures, and analyses code for quality.
Collaborates with colleagues participating in other stages of the Software Development Lifecycle (SDLC).
Identifies dependencies between software product components, between technical components, and between applications and interfaces.
Identifies product integration verifications to be performed based on the integration sequence and relevant dependencies.
Suggests and implements continuous technical improvements on the applications (scalability, reliability, availability, performance).

Your skills and experience

General Skills
Bachelor of Science degree from an accredited college or university with a concentration in Computer Science or Software Engineering (or equivalent), with a minor in Finance, Mathematics, or Engineering.
Strong analytical skills.
Proficient communication skills; fluent in English (written/verbal).
Ability to work in virtual teams and in matrixed organizations.
Excellent team player; open-minded.
Keeps pace with technical innovation; understands the relevant business area.
Ability to share information and transfer knowledge and expertise to team members.
Ability to design and write code in accordance with provided business requirements.
Ability to contribute to QA strategy and architecture decisions.
Knowledge of IT delivery and architecture, including knowledge of data modelling and data analysis.
Relevant Financial Services experience.
Ability to work in a fast-paced environment with competing and alternating priorities with a constant focus on delivery.
Ability to balance business demands and IT fulfilment in terms of standardization, reducing risk, and increasing IT flexibility.

Domain-Specific Skills
Desirable are one or more of the following subject areas. Very good knowledge of the following technologies is needed:
Java / Scala
Apache Spark / Apache Beam
GCP (data engineering services)
Workflow orchestrators like Airflow
Automation through Python / Terraform
Good to have: AI/ML, data analysis, Python development.
Very good knowledge of core processes and tools such as HP ALM, Jira, ServiceNow, SDLC, and Agile processes.
Posted 3 months ago
10 - 20 years
35 - 40 Lacs
Pune
Work from Office
A Senior Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in a number of engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching less experienced engineers.

Your key responsibilities
Hands-on software development.
Solution design and architecture ownership.
Experience in Agile and Scrum delivery.
Should be able to contribute towards good software design.
Participate in daily stand-up meetings.
Strong communication with stakeholders.
Articulate issues and risks to management in a timely manner.
Train other team members to bring them up to speed.

Your skills and experience
Extensive experience with Java and related technologies such as Spring Core/Spring Boot/Hibernate/MyBatis.
Experience in developing applications using data processing frameworks such as Spring Batch, Apache Beam, and Apache Storm.
Experience with a wide variety of open-source tools and frameworks: JMS/JPA/JAX-WS/JAX-RS/JAX-B/JTA standards; XML binding, parsers, XML schemas, XPath, XSLT.
Experience with SSL/X.509 certificates/keystores.
Core Java concepts such as lambdas and functional programming, streams, generics, and concurrency.
Memory management tuning and troubleshooting; experience with profiling and monitoring tools.
Knowledge of solution design and architecture, including UML, design patterns, refactoring, architecture decisions, quality attributes, and documentation.
Experience in Agile.
Experience with messaging and integration, patterns, REST, SOA.
Experience with build and deployment: Maven/Artifactory/TeamCity or Jenkins.
Unix scripting and hands-on experience.
Performance engineering: different types of tests, measurement, monitoring, tools; performance tuning and troubleshooting.
Knowledge of emerging trends and technologies.
Experience with end-to-end design and delivery of solutions.
RDBMS: Oracle design, development, and tuning.

Nice to have
Experience with cloud technologies such as Docker, Kubernetes, OpenShift, Azure.
Experience with Big Data and streaming technologies.
Experience with UI frameworks like Angular or React.
Any additional languages such as Python or Scala.
Sun/Oracle or architecture-specific certifications.

Educational Qualifications
Bachelor's/Master's degree in Computer Science or a relevant field.
Posted 3 months ago
5 - 10 years
15 - 30 Lacs
Pune, Bengaluru, Hyderabad
Work from Office
Preferred candidate profile
Experience: 5 to 10 years
Notice Period: 30 days max
Location: Hyderabad, Pune, Bengaluru, Delhi

Key Skills
- Proficiency in programming languages: Python, Java
- Expertise in data processing frameworks: Apache Beam (Dataflow)
- Active experience with GCP tools and technologies like BigQuery, Dataflow, Cloud Composer, Cloud Spanner, GCS, dbt, etc.
- Data engineering skillset using Python and SQL
- Experience in ETL (Extract, Transform, Load) processes
Posted 3 months ago
3 - 8 years
0 - 2 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Position: Software Engineer / Senior Software Engineer
Location: Anywhere in India
Work Mode: Hybrid/Remote; candidates in any of the Brillio locations (Bangalore, Pune, Chennai, Hyderabad, or Gurgaon) may have to come to the office.

1-5+ years of experience in software design and development.
1+ years of experience in the data engineering field is preferred.
1+ years of hands-on experience with the GCP cloud data implementation suite, such as BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage.
Experience in Dialogflow and Java programming is a must.
Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms.
Strong hands-on experience in the technologies below (see the sketch after this posting):
1. Google BigQuery (GBQ)
2. Python
3. Apache Airflow
4. SQL (BigQuery preferred)
Extensive hands-on experience working with data using SQL, Python, and Cloud Functions.
Comparable skills in AWS and the broader cloud Big Data Engineering space are considered.
Experience with agile development methodologies.
Excellent verbal and written communication skills with the ability to clearly present ideas, concepts, and solutions.
Bachelor's degree in Computer Science, Information Technology, or a closely related discipline.
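Tying together the BigQuery, SQL, and Airflow items above (assuming Airflow 2.x with the Google provider package installed), a sketch of a Composer/Airflow task that runs a BigQuery query through the provider's operator; the DAG id, SQL, and table names are assumptions:

```python
from datetime import datetime

from airflow import DAG
# Requires the apache-airflow-providers-google package.
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="bq_daily_aggregate",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                # {{ ds }} is Airflow's execution date, templated at run time.
                "query": """
                    INSERT INTO analytics.daily_events
                    SELECT DATE(ts) AS day, COUNT(*) AS events
                    FROM raw.events
                    WHERE DATE(ts) = '{{ ds }}'
                    GROUP BY day
                """,
                "useLegacySql": False,
            }
        },
    )
```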
Posted 3 months ago
8 - 12 years
40 - 45 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
Roles & Responsibilities:

Data Engineering Leadership & Strategy: Lead and mentor a team of data engineers, fostering a culture of technical excellence and collaboration. Define and implement data engineering best practices, standards, and processes.

Data Pipeline Architecture & Development: Design, build, and maintain scalable, robust, and efficient data pipelines for ingestion, transformation, and loading of data from various sources. Optimize data pipelines for performance, reliability, and cost-effectiveness. Implement data quality checks and monitoring systems to ensure data integrity. Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.

Cloud-Based Data Infrastructure: Design, implement, and manage cloud-based data infrastructure using platforms like AWS, Azure, or GCP. Leverage cloud services (e.g., data lakes, data warehouses, serverless computing) to build scalable and cost-effective data solutions. Leverage open-source tools such as Airbyte, Mage AI, and similar. Ensure data security, governance, and compliance within the cloud environment.

Data Modeling & Warehousing: Design and implement data models to support business intelligence, reporting, and analytics. Optimize data warehouse performance for efficient querying and reporting.

Collaboration & Communication: Collaborate effectively with cross-functional teams including product managers, software engineers, and business stakeholders.

Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
8+ years of proven experience in data engineering, with at least 3+ years in a lead role.
Expertise in building and maintaining data pipelines using tools such as Apache Spark, Apache Kafka, Apache Beam, or similar.
Proficiency in SQL and one or more programming languages like Python, Java, or Scala.
Hands-on experience with cloud-based data platforms (AWS, Azure, GCP) and services.

Work Timings: 2.30 pm - 11.30 pm IST
Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Posted 3 months ago
5 - 10 years
35 - 50 Lacs
Delhi, Bengaluru, Mumbai (All Areas)
Work from Office
Role: Technology Engineer - RTS (Real-Time Streaming)
Location: Dubai (on-site)

Job Description: Minimum 5+ years of development and design experience in Java/Scala/Python with Flink, Beam (or Spark Streaming using real-time data, not batch data), and Kafka.
• Containerization and orchestration (Docker and OpenShift/Kubernetes)
• Kafka, Flink, Beam/Kafka Streams, Apache Spark, or similar streaming technologies
• Source control (Git), automated build/deployment pipelines (Jenkins, ArgoCD, Kaniko, Shipwright, etc.)
• Public cloud, preferably Azure and OCI
• Linux OS configuration and shell scripting
• Working with streaming data sets at scale
• Public cloud automation toolsets
• Cloud-native applications
• Understanding of GitOps
• Extensive coding experience and knowledge in event-driven and streaming architecture
• Good hands-on experience with design patterns and their implementation
• Experience doing automated unit and integration testing
• Well versed in CI/CD principles (GitHub, Jenkins, etc.), and actively involved in solving and troubleshooting issues in a distributed services ecosystem
• Familiar with distributed services resiliency and monitoring in a production environment
• Responsible for adhering to established policies, following best practices, developing and possessing an in-depth understanding of exploits and vulnerabilities, and resolving issues by taking the appropriate corrective action
• High-level knowledge of compliance and regulatory requirements for data, including but not limited to encryption, anonymization, data integrity, and policy control features in large-scale infrastructures
• Understanding of data sensitivity in terms of logging, events, and in-memory data storage, such as no card numbers or personally identifiable data in logs
• Distributed systems; network fundamentals and host-level routing; tuning distributed systems
• Automated testing
• Security engineering practices and tools
• Event-driven architecture
• Experience in Agile methodology
• Ensure quality of technical and application architecture and design of systems across the organization
• Analytical thinking; effectively research and benchmark technology against other best-in-class technologies
• Able to influence multiple teams on technical considerations, increasing their productivity and effectiveness by sharing deep knowledge and experience
• Self-motivated self-starter, with the ability to own and drive things without supervision, working collaboratively with teams across the organization
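A minimal sketch of the kind of real-time Kafka consumption this role centres on, using the confluent-kafka Python client (the posting equally accepts Java/Scala); the broker address, topic, and consumer group are placeholders:

```python
from confluent_kafka import Consumer, KafkaError

consumer = Consumer({
    "bootstrap.servers": "broker:9092",    # placeholder broker
    "group.id": "payments-processor",      # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["payments"])           # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            # Partition EOF is informational; anything else is a real error.
            if msg.error().code() != KafkaError._PARTITION_EOF:
                raise RuntimeError(msg.error())
            continue
        # Real-time processing would happen here, e.g. enrich and re-publish.
        # Note: per the data-sensitivity point above, never log raw payloads
        # that may contain card numbers or personally identifiable data.
        print(msg.topic(), msg.partition(), len(msg.value()))
finally:
    consumer.close()
```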
Posted 3 months ago
2 - 6 years
4 - 8 Lacs
Pune
Work from Office
Job Title: Technical Specialist - Data Engineer

Role Description
An Engineer is responsible for designing, developing, and delivering significant components of engineering solutions to accomplish business goals. Key responsibilities of this role include active participation in the design and development of new features of application enhancements, investigating re-use, and ensuring that solutions are fit for purpose and maintainable and can be integrated successfully into the overall solution and environment with clear, robust, and well-tested deployments. Assists more junior members of the team and controls their work where applicable.

Your key responsibilities
Develops and deploys source code, including infrastructure and application related configurations, for all software components in accordance with the Detailed Software Requirements specification.
Provides development activities for projects and technical infrastructure components (i.e., Java+Python, Apache Beam/Spark configurations and security, data workflows on-premise/cloud, Infrastructure as Code) and source code development.
Debugs, fixes, and provides support to the L3 team.
Verifies the developed source code by reviews (4-eyes principle).
Contributes to quality assurance by writing and conducting unit testing.
Ensures architectural changes (as defined by Architects) are implemented.
Contributes to problem and root cause analysis.
Integrates software components following the integration strategy.
Verifies integrated software components by unit and integrated software testing according to the software test plan. Software test findings must be resolved.
Works on technical continuous improvements across the modules regularly.
Where applicable, develops routines to deploy CIs to the target environments.
Provides release deployments on non-Production Management controlled environments.
Supports creation of Software Product Training Materials, Software Product User Guides, and Software Product Deployment Instructions. Checks consistency of documents with the respective Software Product Release.
Where applicable, manages maintenance of applications and performs technical change requests scheduled according to Release Management processes.
Fixes software defects/bugs, measures, and analyses code for quality.
Collaborates with colleagues participating in other stages of the Software Development Lifecycle (SDLC).
Identifies dependencies between software product components, between technical components, and between applications and interfaces.
Identifies product integration verifications to be performed based on the integration sequence and relevant dependencies.

General Skills
Bachelor of Science degree from an accredited college or university with a concentration in Computer Science or Software Engineering (or equivalent), with a minor in Finance, Mathematics, or Engineering.
Strong analytical skills.
Proficient communication skills; fluent in English (written/verbal).
Ability to work in virtual teams and in matrixed organizations.
Excellent team player; open-minded.
Keeps pace with technical innovation; understands the relevant business area.
Ability to share information and transfer knowledge and expertise to team members.
Ability to design and write code in accordance with provided business requirements.
Ability to contribute to QA strategy and architecture decisions.
Knowledge of IT delivery and architecture, including knowledge of data modelling and data analysis.
Relevant Financial Services experience.
Ability to work in a fast-paced environment with competing and alternating priorities with a constant focus on delivery.
Ability to balance business demands and IT fulfilment in terms of standardization, reducing risk, and increasing IT flexibility.

Domain-Specific Skills
Desirable are one or more of the following subject areas. Very good knowledge of the following technologies is needed:
Java / Scala
Apache Spark / Apache Beam
GCP (data engineering services)
Workflow orchestrators like Airflow
Automation through Python / Terraform
Good to have: AI/ML, data analysis, Python development.
Very good knowledge of core processes and tools such as HP ALM, Jira, ServiceNow, SDLC, and Agile processes.
Posted 3 months ago