4.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
About Us
Capco, a Wipro company, is a global technology and management consulting firm. Awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount, we have a presence in 32 cities across the globe and support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry, on projects that will transform the financial services industry.
MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.
We offer:
A work culture focused on innovation and creating lasting value for our clients and employees
Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
A diverse, inclusive, meritocratic culture
Role Description:
Over 10 years' experience in risk, either at an investment bank or a clearing house
Excellent analytical skills and a strong grasp of risk management concepts
Strong knowledge of collateral products
Very good understanding of the regulatory background
Strong understanding of risk data and end-to-end (E2E) data flows, combined with strong data analysis skills
Excellent project documentation skills and solid experience with project delivery and test governance processes under Agile methodology
Evidence of covering the full project lifecycle, from inception through system delivery with IT to process embedding with the sponsoring business function
Strong stakeholder management skills: able to collaborate with business and technology teams to understand requirements and share them with the technical team
Posted 3 weeks ago
6.0 - 8.0 years
30 - 35 Lacs
Pune
Work from Office
Job Title: Senior Engineer
Location: Pune, India
Corporate Title: AVP
Role Description
Investment banking is a technology-centric business, with an increasing move to real-time processing and a growing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. The IB CARE Platform aims to increase the productivity of both Google Cloud and on-prem application development by providing a frictionless build and deployment platform that offers service and data reusability. The platform provides the chassis and standard components of an application, ensuring reliability, usability and safety, and gives on-demand access to the services needed to build, host and manage applications on the cloud or on-prem. In addition to technology services, the platform aims to have compliance baked in, enforcing controls and security and reducing application-team involvement in SDLC and ORR controls, enabling teams to focus more on application development and release to production faster.
We are looking for a platform engineer to join a global team working across all aspects of the platform, from GCP/on-prem infrastructure and application deployment through to the development of CARE-based services. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help build a platform that supports some of our most mission-critical processing systems.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy
Gender-neutral parental leave
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those aged 35 and above
Your Key Responsibilities
As a CARE platform engineer you will work across the board on activities to build and support the platform and liaise with tenants. Key responsibility areas:
Manage and monitor cloud computing systems and provide technical support to ensure the systems' efficiency and security
Work with platform leads and platform engineers at a technical level
Liaise with tenants regarding onboarding and provide platform expertise
Contribute to the platform offering as part of sprint deliverables
Support the production platform as part of the wider team
Your Skills and Experience
Understanding of GCP and services such as GKE, IAM, identity services and Cloud SQL
Kubernetes/service mesh configuration
Experience with IaC tooling such as Terraform
Proficiency in SDLC/DevOps best practices
GitHub experience, including Git workflows
Exposure to modern deployment tooling, such as ArgoCD, is desirable
Programming experience (such as Java/Python) is desirable
A strong team player comfortable in a cross-cultural and diverse operating environment
Results-oriented, with the ability to deliver under tight timelines
Ability to successfully resolve conflicts in a global, matrix-driven organization
Excellent communication and collaboration skills
Must be comfortable navigating ambiguity to extract meaningful risk insights
How we'll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
At BCE Global Tech, immerse yourself in exciting projects that are shaping the future of both consumer and enterprise telecommunications. This involves building innovative mobile apps to enhance user experiences and enable seamless connectivity on the go. Thrive in diverse roles like Full Stack Developer, Backend Developer, UI/UX Designer, DevOps Engineer, Cloud Engineer, Data Science Engineer, and Scrum Master, at a workplace that encourages you to freely share your bold and different ideas. If you are passionate about technology and eager to make a difference, we want to hear from you! Apply now to join our dynamic team in Bengaluru.
We are seeking a talented Site Reliability Engineer (SRE) to join our team. The ideal candidate will have a strong background in software engineering and systems administration, with a passion for building scalable and reliable systems. As an SRE, you will collaborate with development and operations teams to ensure our services are reliable, performant, and highly available.
Key Responsibilities
Ensure the 24/7 operation and reliability of data services in our production GCP and on-premise Hadoop environments
Collaborate with the data engineering development team to design, build, and maintain scalable, reliable, and secure data pipelines and systems
Develop and implement monitoring, alerting, and incident response strategies to proactively identify and resolve issues before they impact production (a minimal sketch follows this listing)
Drive the implementation of security and reliability best practices across the software development life cycle
Contribute to the development of tools and automation to streamline the management and operation of data services
Participate in the on-call rotation and respond to incidents in a timely and effective manner
Continuously evaluate and improve the reliability, scalability, and performance of data services
Technology Skills
4+ years of experience in site reliability engineering or a similar role
Strong experience with Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Pub/Sub, and Cloud Storage
Experience with on-premise Hadoop environments and related technologies (HDFS, Hive, Spark, etc.)
Proficiency in at least one programming language (Python, Scala, Java, Go, etc.)
Required Qualifications to Be Successful in This Role
Bachelor's degree in computer science, engineering, or a related field
8-10 years of experience as an SRE
Proven experience as an SRE, DevOps engineer, or similar role
Strong problem-solving skills and ability to work under pressure
Excellent communication and collaboration skills
Flexible to work in the EST time zone (9-5 EST)
Additional Information
Job Type: Full Time
Work Profile: Hybrid (Work from Office/Remote)
Years of Experience: 8-10 Years
Location: Bangalore
What We Offer
Competitive salaries and comprehensive health benefits
Flexible work hours and remote work options
Professional development and training opportunities
A supportive and inclusive work environment
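For illustration only, here is a minimal Python sketch of the alert-handling pattern the responsibilities above describe: a Pub/Sub subscriber that acks processed alert messages and nacks failures so Pub/Sub redelivers them. The project and subscription names are hypothetical placeholders, not details from the posting.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

# Hypothetical placeholders; substitute real identifiers.
PROJECT_ID = "my-gcp-project"
SUBSCRIPTION_ID = "pipeline-alerts-sub"


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    """Handle one alert: ack on success, nack so Pub/Sub redelivers on failure."""
    try:
        print(f"Received alert: {message.data.decode('utf-8')}")
        message.ack()
    except Exception:
        message.nack()


def main() -> None:
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)
    future = subscriber.subscribe(subscription_path, callback=callback)
    print(f"Listening on {subscription_path} ...")
    with subscriber:
        try:
            future.result(timeout=60)  # run for 60 seconds in this sketch
        except TimeoutError:
            future.cancel()
            future.result()  # block until shutdown completes


if __name__ == "__main__":
    main()
```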
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
About Us
BCE Global Tech is a dynamic and innovative company dedicated to pushing the boundaries of technology. We are on a mission to modernize global connectivity, one connection at a time. Our goal is to build the highway to the future of communications, media, and entertainment, emerging as a powerhouse within the technology landscape in India. We bring ambitions to life through design thinking that bridges the gaps between people, devices, and beyond, fostering unprecedented customer satisfaction through technology.
At BCE Global Tech, we are guided by our core values of innovation, customer-centricity, and a commitment to progress. We harness cutting-edge technology to provide business outcomes with positive societal impact. Our team of thought leaders is pioneering advancements in 5G, MEC, IoT, and cloud-native architecture. We offer continuous learning opportunities, innovative projects, and a collaborative work environment that empowers our employees to grow and succeed.
Responsibilities
Lead the migration of data pipelines from Hadoop to Google Cloud Platform (GCP)
Design, develop, and maintain data workflows using Airflow and custom flow solutions (a minimal DAG sketch follows this listing)
Implement infrastructure as code using Terraform
Develop and optimize data processing applications using Java Spark or Python Spark
Utilize Cloud Run and Cloud Functions for serverless computing
Manage containerized applications using Docker
Understand and enhance existing Hadoop pipelines
Write and execute unit tests to ensure code quality
Deploy data engineering solutions in production environments
Craft and optimize SQL queries for data manipulation and analysis
Requirements
7-8 years of experience in data engineering or related fields
Proven experience migrating Hadoop pipelines to GCP
Proficiency in Airflow and custom flow solutions
Strong knowledge of Terraform for infrastructure management
Expertise in Java Spark or Python Spark
Experience with Cloud Run and Cloud Functions
Experience with Dataflow, Dataproc, and Cloud Monitoring tools in GCP
Familiarity with Docker for container management
Solid understanding of Hadoop pipelines
Ability to write and execute unit tests
Experience with deployments in production environments
Strong SQL query skills
Skills
Excellent teamwork and collaboration abilities
Quick learner with a proactive attitude
Strong problem-solving skills and attention to detail
Ability to work independently and as part of a team
Effective communication skills
Why Join Us
Opportunity to work with cutting-edge technologies
Collaborative and supportive work environment
Competitive salary and benefits
Career growth and development opportunities
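As a rough illustration of the Airflow workflow work this role describes, here is a minimal DAG sketch for a two-step Hadoop-to-GCP handoff. It assumes Apache Airflow 2.4+; the DAG id and task bodies are hypothetical placeholders, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_hadoop() -> None:
    # Placeholder: a real task might run a Spark job that stages HDFS
    # data as Parquet files in a GCS bucket.
    print("extracting legacy Hadoop data ...")


def load_to_bigquery() -> None:
    # Placeholder: a real task might submit a BigQuery load job over
    # the staged GCS files.
    print("loading staged files into BigQuery ...")


with DAG(
    dag_id="hadoop_to_gcp_migration",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_hadoop)
    load = PythonOperator(task_id="load", python_callable=load_to_bigquery)
    extract >> load  # load runs only after extraction succeeds
```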
Posted 3 weeks ago
4.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Data Transformation: Utilize Data Build Tool (dbt) to transform raw data into curated data models according to business requirements. Implement data transformations and aggregations to support analytical and reporting needs.
Orchestration and Automation: Design and implement automated workflows using Google Cloud Composer to orchestrate data pipelines and ensure timely data delivery. Monitor and troubleshoot data pipelines, identifying and resolving issues proactively. Develop and maintain documentation for data pipelines and workflows.
GCP Expertise: Leverage GCP services, including BigQuery, Cloud Storage, and Pub/Sub, to build a robust and scalable data platform. Optimize BigQuery performance and cost through efficient query design and data partitioning (a sketch follows this listing). Implement data security and access controls in accordance with banking industry standards.
Collaboration and Communication: Collaborate with the Solution Architect and Data Modeler to understand data requirements and translate them into technical solutions. Communicate effectively with team members and stakeholders, providing regular updates on project progress. Participate in code reviews and contribute to the development of best practices.
Data Pipeline Development: Design, develop, and maintain scalable and efficient data pipelines using Google Cloud Dataflow to ingest data from various sources, including relational databases (RDBMS), data streams, and files. Implement data quality checks and validation processes to ensure data accuracy and consistency. Optimize data pipelines for performance and cost-effectiveness.
Banking Domain Knowledge (Preferred): Understanding of banking data domains, such as customer data, transactions, and financial products. Familiarity with regulatory requirements and data governance standards in the banking industry.
Required Experience:
Bachelor's degree in computer science, engineering, or a related field
ETL knowledge
4-9 years of experience in data engineering, with a focus on building data pipelines and data transformations
Strong proficiency in SQL and experience working with relational databases
Hands-on experience with Google Cloud Platform (GCP) services, including Dataflow, BigQuery, Cloud Composer, and Cloud Storage
Experience with data transformation tools, preferably Data Build Tool (dbt)
Proficiency in Python or other scripting languages is a plus
Experience with data orchestration and automation
Strong problem-solving and analytical skills
Excellent communication and collaboration skills
Experience with data streams like Pub/Sub or similar
Experience working with files such as CSV, JSON, and Parquet
Primary Skills: GCP, Dataflow, BigQuery, Cloud Composer, Cloud Storage, data pipelines, Composer, SQL, dbt, DWH concepts
Secondary Skills: Python, banking domain knowledge, Pub/Sub, cloud certifications (e.g., Data Engineer), Git or another version control system
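To make the BigQuery partitioning point above concrete, here is a small Python sketch (using the google-cloud-bigquery client) that creates a date-partitioned table and queries it with a filter on the partitioning column, so only the relevant partition's bytes are scanned and billed. The project, dataset, and schema are hypothetical placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical table: partition on the DATE column so queries can prune partitions.
table = bigquery.Table(
    "my-project.curated.transactions",
    schema=[
        bigquery.SchemaField("txn_id", "STRING"),
        bigquery.SchemaField("amount", "NUMERIC"),
        bigquery.SchemaField("txn_date", "DATE"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(field="txn_date")
client.create_table(table, exists_ok=True)

# The WHERE clause on the partitioning column limits bytes scanned (and cost).
query = """
    SELECT txn_id, amount
    FROM `my-project.curated.transactions`
    WHERE txn_date = @run_date
"""
job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01")
        ]
    ),
)
for row in job.result():
    print(row.txn_id, row.amount)
```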
Posted 3 weeks ago
3.0 - 7.0 years
10 - 20 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 8 to 24 LPA
Experience: 3 to 7 years
Location: Gurgaon (Hybrid)
Notice period: Immediate to 30 days
Job Title: Senior Data Engineer
Job Summary: We are looking for an experienced Senior Data Engineer with 5+ years of hands-on experience in cloud data engineering platforms, specifically AWS, Databricks, and Azure. The ideal candidate will play a critical role in designing, building, and maintaining scalable data pipelines and infrastructure to support our analytics and business intelligence initiatives.
Key Responsibilities:
Design, develop, and optimize scalable data pipelines using AWS services (e.g., S3, Glue, Redshift, Lambda)
Build and maintain ETL/ELT workflows leveraging Databricks and Apache Spark for processing large datasets (a minimal PySpark sketch follows this listing)
Work extensively with Azure data services such as Azure Data Lake, Azure Synapse, Azure Data Factory, and Azure Databricks
Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver high-quality data solutions
Ensure data quality, reliability, and security across multiple cloud platforms
Monitor and troubleshoot data pipelines, implement performance tuning, and optimize resource usage
Implement best practices for data governance, metadata management, and documentation
Stay current with emerging cloud data technologies and industry trends to recommend improvements
Required Qualifications:
5+ years of experience in data engineering with strong expertise in AWS, Databricks, and Azure cloud platforms
Hands-on experience with big data processing frameworks, particularly Apache Spark
Proficiency in building complex ETL/ELT pipelines and managing data workflows
Strong programming skills in Python, Scala, or Java
Experience working with structured and unstructured data in cloud storage solutions
Knowledge of SQL and experience with relational and NoSQL databases
Familiarity with CI/CD pipelines and DevOps practices in cloud environments
Strong analytical and problem-solving skills with an ability to work independently and in teams
Preferred Skills:
Experience with containerization and orchestration tools (Docker, Kubernetes)
Familiarity with machine learning pipelines and tools
Knowledge of data modeling, data warehousing, and analytics architecture
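A minimal PySpark sketch of the ETL pattern described above, of the kind that would run on Databricks: read raw files, clean and aggregate, and write a curated, partitioned output. Paths and column names are hypothetical placeholders; on Databricks the SparkSession is already provided as spark.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed in cloud storage (the S3 path is a placeholder;
# an ADLS abfss:// path would work the same way on Azure).
raw = spark.read.option("header", True).csv("s3://raw-bucket/orders/")

# Transform: type-cast, drop malformed rows, and derive a daily aggregate.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)
daily = orders.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

# Load: write the curated output as Parquet, partitioned by date.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/orders_daily/"
)
```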
Posted 3 weeks ago
3.0 - 7.0 years
10 - 20 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 8 to 24 LPA
Experience: 3 to 7 years
Location: Gurgaon (Hybrid)
Notice period: Immediate to 30 days
Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, and ETL/ELT development, and at ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.
As a Data Engineer, the candidate will work on assignments for one of our utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.
Responsibilities:
Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs
Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions
Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes
Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments
Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics
Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud
Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost
Must have:
Client engagement experience and collaboration with cross-functional teams
Data engineering background in Databricks
Capable of working effectively as an individual contributor or in collaborative team environments
Effective communication and thought leadership with a proven record
Candidate Profile:
Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or a related analytics area
3+ years of experience in data engineering
Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure
Prior experience managing and delivering end-to-end projects
Outstanding written and verbal communication skills
Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges
Able to understand cross-cultural differences and work with clients across the globe
Posted 3 weeks ago
5.0 - 10.0 years
8 - 18 Lacs
Noida
Remote
Well versed in C++ multithreaded programming
Excellent programming skills in C++, Java, and C#
Familiarity with the FIX protocol, market data distribution, and order handling is a plus
Strong command of spoken and written English
Posted 3 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Coimbatore
Work from Office
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. May also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Google Cloud Machine Learning Services
Good to have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.
Roles & Responsibilities:
Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage (a minimal Beam sketch follows this listing)
Optimize and monitor data workflows for performance, scalability, and reliability
Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions
Implement data security and governance measures, ensuring compliance with industry standards
Automate data workflows and processes for operational efficiency
Troubleshoot and resolve technical issues related to data pipelines and platforms
Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing
Professional & Technical Skills:
Must have:
Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage
Expertise in SQL and experience with data modeling and query optimization
Solid programming skills in Python for data processing and ETL development
Experience with CI/CD pipelines and version control systems (e.g., Git)
Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming
Strong understanding of data security, encryption, and IAM policies on GCP
Good to have:
Experience with Dialogflow or CCAI tools
Knowledge of machine learning pipelines and integration with AI/ML services on GCP
Certifications such as Google Professional Data Engineer or Google Cloud Architect
Additional Information:
The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with 3-5 years of overall experience
The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions
Qualifications: 15 years full time education
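As an illustration of the Dataflow pipeline work listed above, here is a minimal Apache Beam sketch in Python that reads CSV events from Cloud Storage and writes them to BigQuery. The bucket, table, schema, and line format are hypothetical assumptions; passing --runner=DataflowRunner with project/region options would execute it on Dataflow.

```python
import apache_beam as beam  # pip install "apache-beam[gcp]"
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    # Assumes each line is "user_id,amount" with no header row.
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}


def run() -> None:
    options = PipelineOptions()  # add --project/--region/--runner flags as needed
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.csv")
            | "Parse" >> beam.Map(parse_line)
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table
                schema="user_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```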
Posted 3 weeks ago
1.0 - 4.0 years
3 - 7 Lacs
Kolkata
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems.
Must have skills: SAP Ariba
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve software-related challenges, ensuring that business operations run smoothly and efficiently. You will engage in problem-solving activities, analyze system performance, and contribute to the continuous improvement of application support processes, all while maintaining a focus on delivering exceptional service to stakeholders.
Roles & Responsibilities:
Expected to be an SME
Collaborate with and manage the team to perform
Responsible for team decisions
Engage with multiple teams and contribute to key decisions
Provide solutions to problems for the immediate team and across multiple teams
Facilitate knowledge-sharing sessions to enhance team capabilities
Monitor system performance and proactively address potential issues
Professional & Technical Skills:
Must have: Proficiency in SAP Ariba
Strong understanding of application support processes and methodologies
Experience with troubleshooting and resolving software issues
Familiarity with system integration and data flow management
Ability to work collaboratively in a team-oriented environment
Additional Information:
The candidate should have a minimum of 5 years of experience in SAP Ariba
This position is based at our Kolkata office
A 15 years full time education is required
Posted 3 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: SAP BW/4HANA Data Modeling & Development
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the defined requirements effectively.
Roles & Responsibilities:
Expected to perform independently and become an SME
Active participation and contribution in team discussions is required
Contribute to providing solutions to work-related problems
Collaborate with stakeholders to gather and analyze requirements for application design
Participate in the testing and validation of applications to ensure they meet business needs
Professional & Technical Skills:
Must have: Proficiency in SAP BW/4HANA Data Modeling & Development
Good to have: SAP ABAP, CDS views
Strong understanding of data modeling concepts and best practices
Experience with application design methodologies and tools
Ability to analyze and interpret complex business requirements
Familiarity with integration techniques and data flow management
Additional Information:
The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development
This position is based at our Bengaluru office
A 15 years full time education is required
Posted 3 weeks ago
7.0 - 12.0 years
13 - 18 Lacs
Pune
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: Microsoft Power Business Intelligence (BI)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Architect, you will define the data requirements and structure for the application, and model and design the application data structure, storage, and integration. You will play a crucial role in shaping the data architecture of the organization and ensuring seamless data flow.
Roles & Responsibilities:
Expected to be an SME
Collaborate with and manage the team to perform
Responsible for team decisions
Engage with multiple teams and contribute to key decisions
Provide solutions to problems for the immediate team and across multiple teams
Lead data architecture discussions and decisions
Develop data models and database designs
Implement data governance policies
Professional & Technical Skills:
Must have: Proficiency in Microsoft Power Business Intelligence (BI)
Strong understanding of data modeling and database design
Experience with ETL processes and tools
Knowledge of data integration and data warehousing concepts
Hands-on experience with SQL and database management
Good to have: Experience with data visualization tools
Additional Information:
The candidate should have a minimum of 7.5 years of experience in Microsoft Power Business Intelligence (BI)
This position is based at our Pune office
A 15 years full time education is required
Posted 3 weeks ago
7.0 - 12.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: SAP HCM On Premise ABAP, SAP ABAP BOPF
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Architect, you will define the data requirements and structure for the application, and model and design the application data structure, storage, and integration. You will play a crucial role in shaping the data architecture of the project and ensuring seamless data flow.
Roles & Responsibilities:
Expected to be an SME
Collaborate with and manage the team to perform
Responsible for team decisions
Engage with multiple teams and contribute to key decisions
Provide solutions to problems for the immediate team and across multiple teams
Lead data governance initiatives to ensure data quality and integrity
Develop data models and database designs for efficient data storage
Implement data security measures to protect sensitive information
Professional & Technical Skills:
Must have: Proficiency in SAP HCM On Premise ABAP and SAP ABAP BOPF
Strong understanding of data modeling and database design
Experience in data integration and ETL processes
Knowledge of data governance and data security best practices
Hands-on experience with SAP HANA database management
Additional Information:
The candidate should have a minimum of 7.5 years of experience in SAP HCM On Premise ABAP
This position is based at our Bengaluru office
A 15 years full time education is required
Posted 3 weeks ago
5.0 - 9.0 years
14 - 19 Lacs
Chennai
Work from Office
Project description
We are seeking a highly skilled Senior Power BI Developer with strong expertise in Power BI, SQL Server, and data modeling to join our Business Intelligence team. In this role, you will lead the design and development of interactive dashboards, robust data models, and data pipelines that empower business stakeholders to make informed decisions. You will work collaboratively with cross-functional teams and drive the standardization and optimization of our BI architecture.
Responsibilities
Power BI Dashboard Development (UI Dashboards)
Design, develop, and maintain visually compelling, interactive Power BI dashboards aligned with business needs
Collaborate with business stakeholders to gather requirements, develop mockups, and refine dashboard UX
Implement advanced Power BI features like bookmarks, drill-throughs, dynamic tooltips, and DAX calculations
Conduct regular UX/UI audits and performance tuning on reports
Data Modeling in SQL Server & Dataverse
Build and manage scalable, efficient data models in Power BI, Dataverse, and SQL Server
Apply best practices in dimensional modeling (star/snowflake schema) to support analytical use cases
Ensure data consistency, accuracy, and alignment across multiple sources and business areas
Optimize models and queries for performance and load times
Power BI Dataflows & ETL Pipelines
Develop and maintain reusable Power BI Dataflows for centralized data transformations
Create ETL processes using Power Query, integrating data from diverse sources including SQL Server, Excel, APIs, and Dataverse
Automate data refresh schedules and monitor dependencies across datasets and reports
Ensure efficient data pipeline architecture for reuse, scalability, and maintenance
Skills
Must have
Experience: 6+ years in Business Intelligence or Data Analytics with a strong focus on Power BI and SQL Server
Technical skills:
Expert-level Power BI development, including DAX, custom visuals, and report optimization
Strong knowledge of SQL (T-SQL) and relational database design
Experience with Dataverse and Power Platform integration
Proficiency in Power Query, Dataflows, and ETL development
Modeling: Proven experience in dimensional modeling, star/snowflake schemas, and performance tuning
Data integration: Skilled in connecting and transforming data from various sources, including APIs, Excel, and cloud data services
Collaboration: Ability to work with stakeholders to define KPIs, business logic, and dashboard UX
Nice to have
N/A
Other
Languages: English, C1 Advanced
Seniority: Senior
Posted 3 weeks ago
3.0 - 5.0 years
10 - 13 Lacs
Chennai
Work from Office
3+ years of experience as an engineer working in a GCP environment with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.)
1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.); a small illustrative sketch follows this listing
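A small sketch pairing the Python and SQL skills this listing names: Python driving a BigQuery query that uses a CTE and the ROW_NUMBER() window function to pick each customer's latest order. The project and table names are hypothetical placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()

# A CTE stages the filtered rows; ROW_NUMBER() then ranks each customer's
# orders so the outer query can keep only the most recent one.
query = """
    WITH recent_orders AS (
        SELECT customer_id, order_id, order_ts
        FROM `my-project.sales.orders`
        WHERE order_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    )
    SELECT customer_id, order_id
    FROM (
        SELECT
            customer_id,
            order_id,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id ORDER BY order_ts DESC
            ) AS rn
        FROM recent_orders
    )
    WHERE rn = 1
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_id)
```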
Posted 4 weeks ago
5.0 - 7.0 years
15 - 17 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming languages: Java; scripting languages like Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing with Dataflow)
Posted 4 weeks ago
5.0 - 10.0 years
15 - 20 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming languages: Java; scripting languages like Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing with Dataflow)
Mandatory Key Skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, data processing, Java
Posted 4 weeks ago
3.0 - 8.0 years
10 - 18 Lacs
Guwahati
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
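For illustration, here is a minimal Python sketch of the SQL-plus-Python pipeline pattern this listing describes: pull raw rows over SQL, clean them with pandas, and write a curated table back to the warehouse. The connection string, tables, and columns are hypothetical placeholders (for AWS Redshift the DSN would be a Redshift/Postgres-style URL).

```python
import pandas as pd
from sqlalchemy import create_engine  # pip install sqlalchemy pandas psycopg2-binary

# Hypothetical warehouse DSN; substitute real host and credentials.
engine = create_engine("postgresql+psycopg2://user:pass@host:5439/warehouse")

# Extract: pull the raw rows with plain SQL.
raw = pd.read_sql("SELECT user_id, event_ts FROM raw_events", engine)

# Transform: drop malformed rows and derive an analysis-ready date column.
clean = raw.dropna(subset=["user_id"]).assign(
    event_date=lambda df: pd.to_datetime(df["event_ts"]).dt.date
)

# Load: persist the curated set for analysts and data scientists.
clean.to_sql("curated_events", engine, if_exists="replace", index=False)
```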
Posted 4 weeks ago
3.0 - 8.0 years
10 - 18 Lacs
Kochi
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 4 weeks ago
3.0 - 8.0 years
10 - 18 Lacs
Kanpur
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 4 weeks ago
5.0 - 8.0 years
17 - 20 Lacs
Kolkata
Work from Office
Key Responsibilities
Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake (a small sketch follows this listing)
Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers
Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy
Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability
Define and enforce data governance, data cataloging, and metadata management best practices
Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency
Mentor junior architects and data engineers, guiding them on design best practices and technology standards
Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data
Skills & Qualifications
3+ years of experience in data architecture, data engineering, or enterprise data platform roles
3+ years of hands-on experience with Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, Data Catalog)
3+ years of experience designing and implementing Snowflake-based data solutions
Deep understanding of modern data architecture principles (data lakehouse, ELT/ETL, data mesh, etc.)
Proficient in Python, SQL, and orchestration tools like Airflow/Cloud Composer
Experience in data modeling (3NF, star, snowflake schemas) and designing data marts and warehouses
Strong understanding of data privacy, compliance (GDPR, HIPAA), and security principles in cloud environments
Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus
Excellent communication and stakeholder management skills
GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro)
Preferred Qualifications
Experience working with hybrid or multi-cloud data strategies
Exposure to ML/AI pipelines and support for data science workflows
Prior experience leading architecture reviews, PoCs, and technology roadmaps
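A minimal sketch of the GCP-to-Snowflake flow the responsibilities above describe: read an aggregate from BigQuery and stage it into a Snowflake table. All identifiers and credentials are hypothetical placeholders; a production pipeline would use staged bulk loads and a secrets manager rather than inline credentials and row-by-row inserts.

```python
from google.cloud import bigquery        # pip install google-cloud-bigquery
import snowflake.connector               # pip install snowflake-connector-python

# Pull a curated aggregate from BigQuery (hypothetical table).
bq = bigquery.Client()
rows = bq.query(
    "SELECT region, SUM(revenue) AS revenue "
    "FROM `my-project.mart.sales` GROUP BY region"
).result()

# Stage the result into Snowflake (hypothetical account and schema).
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()
cur.execute(
    "CREATE TABLE IF NOT EXISTS region_revenue (region STRING, revenue NUMBER)"
)
cur.executemany(
    "INSERT INTO region_revenue (region, revenue) VALUES (%s, %s)",
    [(r.region, float(r.revenue)) for r in rows],
)
conn.close()
```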
Posted 4 weeks ago
10.0 - 15.0 years
11 - 15 Lacs
Jhagadia
Work from Office
Develop, implement, and maintain the organization's MIS to ensure accurate and real-time reporting of key business metrics
Oversee the preparation and distribution of daily, weekly, and monthly reports to various departments and senior management
Ensure data accuracy, integrity, and consistency across all reporting platforms
Design and maintain dashboards for business performance monitoring
Analyze data trends and provide insights to management for informed decision-making
Establish and maintain cost accounting systems and procedures for accurate tracking of material, labor, and overhead costs
Review and update cost standards, analyzing variances and taking corrective action when necessary
Collaborate with other departments to monitor and control project costs, ensuring alignment with budget and financial goals
Perform cost analysis and prepare cost reports to monitor financial performance and support pricing decisions
Conduct regular audits to ensure compliance with costing policies and industry standards
Provide regular cost analysis reports, highlighting variances between actual and budgeted figures, and recommend corrective actions
Support financial forecasting and budgeting processes by providing relevant data and insights
Assist in month-end and year-end closing processes by ensuring accurate costing and reporting entries
Review profitability analysis reports and identify areas for cost optimization
Posted 4 weeks ago
5.0 - 7.0 years
15 - 20 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming languages: Java; scripting languages like Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing with Dataflow)
Mandatory Key Skills: agile development, data processing, Python, Shell Script, SQL, Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow
Posted 4 weeks ago
7.0 - 9.0 years
19 - 22 Lacs
Chennai
Work from Office
This role is for a software engineer with 7+ years of experience, data engineering knowledge, and the following skill set:
1. End-to-end full stack development
2. GCP services such as BigQuery, Astronomer, Terraform, Airflow, and Dataflow, plus GCP architecture
3. Python full stack, and Java with cloud
Mandatory Key Skills: software engineering, BigQuery, Terraform, Airflow, Dataflow, GCP architecture, Java, cloud, data engineering
Posted 4 weeks ago