20 AWS Databricks Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You are a strategic thinker passionate about driving solutions in External Reporting. You have found the right team. As a Data Controllers & Reporting Analyst within the Firmwide Regulatory Reporting & Analysis (FRRA) team, you will collaborate on production processing and reporting activities, with a particular focus on U.S. regulatory reports such as FR Y-9C, Call Report, and CCAR. Your role will involve ensuring the accuracy and completeness of our regulatory submissions. As part of the Corporate Finance division, our team executes the Firm's regulatory reporting requirements to U.S. regulators, ensuring consistency and accuracy in reporting and capital stress testing submissions. As part of the diverse global DCR team within FRRA, you will be committed to maintaining data completeness and accuracy across 25+ jurisdictions. Your mission will involve data sourcing, validations, adjustment processing, and reconciliations to support our financial reporting platform.

Job Responsibilities:
- Manage BAU activities, including data sourcing, data validation and completeness, adjustments processing, and reconciliations.
- Execute the overall operating model and procedures for functional areas in the reporting space.
- Manage client relations, communications, and presentations.
- Support business users of the FRI application with user queries and issue resolution.
- Identify and execute process improvements to the existing operating model, tools, and procedures.
- Interact with Controllers, Report Owners, and RFT (Risk & Finance Technology) partners.
- Act as an interface with Control partners, ensuring compliance with risk and controls policies.
- Escalate issues as needed to the appropriate team(s) and management.
- Partner with the projects team through full project life cycles.
- Lead programs and initiatives for reporting automation and operating model optimization.

Required Qualifications, Skills, and Capabilities:
- Bachelor's degree in Accounting, Finance, or a related discipline
- Strong oral and written communication skills, with the ability to partner effectively with managers and stakeholders at all levels
- Strong working knowledge of MS Office applications (Excel, Word, PowerPoint), specifically for reconciliations and for summarizing and formatting data
- Experience using data management and visualization tools in a reporting setting: AWS Databricks, Alteryx, SQL, Tableau, Visio
- Enthusiastic and self-motivated; effective under pressure, with a strong work ethic and keen attention to detail and accuracy
- Aptitude and desire to learn quickly, be flexible, and think strategically
- Client- and business-focused; able to work collaboratively and build strong partnerships with clients and colleagues at all levels

Preferred Qualifications, Skills, and Capabilities:
- Familiarity with U.S. regulatory reporting (e.g., Y-9C, Call Report, CCAR), controllership functions, banking and brokerage products, and U.S. GAAP accounting principles
- Control mindset and exposure to establishing or enhancing existing controls
- Strong verbal and written communication skills, with the ability to present information at varying levels of detail depending on the audience
- Strong process and project management skills

Posted 1 day ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Job Title: AWS Data Engineer
Location: Bangalore
Notice Period: Immediate to 60 days preferred

Job Description: We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS and Databricks. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark and a proven ability to optimize the performance of Spark job executions.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines for cloud platforms, including AWS.
- Implement data ingestion and transformation processes to facilitate efficient data warehousing.
- Utilize AWS services to enhance data processing capabilities: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS.
- Optimize Spark job performance to ensure high efficiency and reliability.
- Stay proactive in learning and implementing new technologies to improve data processing frameworks.
- Collaborate with cross-functional teams to deliver robust data solutions.
- Work on Spark Streaming for real-time data processing as necessary.

Qualifications:
- 5-8 years of experience in data engineering with a strong focus on cloud environments.
- Proficiency in PySpark or Spark (mandatory).
- Proven experience with data ingestion, transformation, and data warehousing.
- In-depth knowledge of and hands-on experience with AWS cloud services.
- Demonstrated ability in performance optimization of Spark jobs.
- Strong problem-solving skills and the ability to work independently as well as in a team.
- AWS cloud certification is a plus.
- Familiarity with Spark Streaming is a bonus.
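For context on the Spark performance optimization this posting emphasizes, here is a minimal, illustrative PySpark sketch; the bucket paths, column names, and schemas are hypothetical, not taken from the posting. It avoids a shuffle by broadcasting a small dimension table and aligns output partitioning with the write key:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Raw fact data on S3 (hypothetical bucket and paths).
orders = spark.read.parquet("s3://example-lake/raw/orders/")
# Small dimension table: broadcasting it avoids a full shuffle join.
countries = spark.read.parquet("s3://example-lake/raw/countries/")

enriched = (
    orders
    .join(F.broadcast(countries), on="country_code", how="left")
    .withColumn("order_date", F.to_date("order_ts"))
)

# Repartition by the write key so output files align with the on-disk
# partitioning, avoiding the many-small-files problem.
(
    enriched
    .repartition("order_date")
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-lake/curated/orders/")
)
```

Broadcasting is only safe when the dimension table comfortably fits in executor memory; otherwise Spark's default sort-merge join is the better choice.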

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You are a strategic thinker who is passionate about driving solutions in External Reporting. You have found the right team. As a Data Controllers & Reporting Analyst within the Firmwide Regulatory Reporting & Analysis (FRRA) team, you will collaborate on production processing and reporting activities, focusing on U.S. regulatory reports such as FR Y-9C, Call Report, and CCAR. Your responsibilities will include ensuring the accuracy and completeness of regulatory submissions. Working in the Corporate Finance division, your team is responsible for executing the Firm's regulatory reporting requirements to U.S. regulators, ensuring consistency and accuracy in reporting and capital stress testing submissions. As part of the diverse global DCR team within FRRA, you are committed to maintaining data completeness and accuracy across 25+ jurisdictions. Your mission involves data sourcing, validations, adjustment processing, and reconciliations to support the financial reporting platform.

Job Responsibilities:
- Manage BAU activities, including data sourcing, validation, completeness, adjustments processing, and reconciliations.
- Execute the overall operating model and procedures for functional areas in the reporting space.
- Manage client relations, communications, and presentations.
- Support business users of the FRI application with user queries and issue resolution.
- Identify and execute process improvements to the existing operating model, tools, and procedures.
- Interact with Controllers, Report Owners, and RFT (Risk & Finance Technology) partners.
- Act as an interface with Control partners, ensuring compliance with risk and controls policies.
- Escalate issues as needed to the appropriate team(s) and management.
- Partner with the projects team through full project life cycles.
- Lead programs and initiatives for reporting automation and operating model optimization.

Required Qualifications, Skills, and Capabilities:
- Bachelor's degree in Accounting, Finance, or a related discipline
- Strong oral and written communication skills, with the ability to partner effectively with managers and stakeholders at all levels
- Strong working knowledge of MS Office applications (Excel, Word, PowerPoint), particularly for reconciliations and for summarizing and formatting data
- Experience using data management and visualization tools in a reporting setting: AWS Databricks, Alteryx, SQL, Tableau, Visio
- Enthusiastic and self-motivated; effective under pressure, with a strong work ethic and keen attention to detail and accuracy
- Aptitude and desire to learn quickly, be flexible, and think strategically
- Client- and business-focused; able to work collaboratively and build strong partnerships with clients and colleagues at all levels

Preferred Qualifications, Skills, and Capabilities:
- Familiarity with U.S. regulatory reporting (e.g., Y-9C, Call Report, CCAR), controllership functions, banking and brokerage products, and U.S. GAAP accounting principles
- Control mindset and exposure to establishing or enhancing existing controls
- Strong verbal and written communication skills, with the ability to present information at varying levels of detail depending on the audience
- Strong process and project management skills

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

You are a strategic thinker who is passionate about driving solutions in regulatory reporting. You have found the right team. As a Regulatory Reporting Associate in our Finance team, you will spend each day defining, refining, and delivering set goals for our firm. As a Firmwide Regulatory Reporting & Analysis (FRRA) Associate within Corporate Finance, you will play a crucial role in collaborating across the organization to provide strategic analysis, oversight, and coordination of production processing and reporting activities, including strategic initiatives for U.S. regulatory reports such as FR Y-9C, Call Report, and CCAR. The FRRA team resides within Corporate Finance and is responsible for executing and delivering the Firm's regulatory reporting requirements to U.S. regulators. The team has end-to-end responsibility for U.S. regulatory reporting and capital stress testing, including the design, implementation, and oversight of execution, analysis, and control and governance frameworks. Your mandate will involve determining the appropriate investment in people, processes, and technology to enhance the accuracy, completeness, and consistency of the Firm's U.S. regulatory reporting and capital stress testing submissions, as well as implementing new requirements and guidelines as they are published.

Job Responsibilities:
- Ensure BAU activities by sourcing data, validating completeness, processing adjustments, and performing reconciliations.
- Execute the overall operating model and procedures for functional areas in the reporting space.
- Manage client relations, communications, and presentations effectively.
- Support business users of the FRI application by addressing user queries and resolving issues.
- Identify and execute process improvements to enhance the existing operating model, tools, and procedures.
- Interact with Controllers, Report Owners, and RFT (Risk & Finance Technology) partners.
- Act as an interface with Control partners, ensuring compliance with risk and controls policies.
- Escalate issues as needed to the appropriate team(s) and management.
- Partner with the projects team through full project life cycles.
- Lead programs and initiatives for reporting automation and operating model optimization.

Required Qualifications, Skills, and Capabilities:
- Bachelor's degree in Accounting, Finance, or a related discipline
- 8+ years of financial services or related experience
- Strong oral and written communication skills, with the ability to partner effectively with managers and stakeholders at all levels
- Strong working knowledge of MS Office applications (Excel, Word, PowerPoint), specifically for reconciliations and for summarizing and formatting data
- Experience using data management and visualization tools in a reporting setting: AWS Databricks, Alteryx, SQL, Tableau, Visio
- Familiarity with U.S. regulatory reporting (e.g., Y-9C, Call Report, CCAR), controllership functions, banking and brokerage products, and U.S. GAAP accounting principles
- Control mindset and exposure to establishing or enhancing existing controls
- Aptitude and desire to learn quickly, be flexible, and think strategically

Preferred Qualifications, Skills, and Capabilities:
- Strong verbal and written communication skills, with the ability to present information at varying levels of detail depending on the audience
- Strong process and project management skills
- Enthusiastic and self-motivated; effective under pressure, with a strong work ethic and keen attention to detail and accuracy
- Client- and business-focused; able to work collaboratively and build strong partnerships with clients and colleagues at all levels

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

You are a strategic thinker passionate about driving solutions in regulatory reporting and analysis. You have found the right team. As a Firmwide Regulatory Reporting & Analysis (FRRA) Associate within Corporate Finance, you will play a crucial role in defining, refining, and achieving set goals for our firm. You will collaborate across the organization to provide strategic analysis, oversight, and coordination of production processing and reporting activities, including strategic initiatives for U.S. regulatory reports such as FR Y-9C, Call Report, and CCAR. The FRRA team resides within Corporate Finance and is responsible for executing and delivering the Firm's regulatory reporting requirements to U.S. regulators, with end-to-end responsibility for U.S. regulatory reporting and capital stress testing. This includes the design, implementation, and oversight of execution, analysis, and control and governance frameworks. Your mandate will involve determining the appropriate investment in people, processes, and technology to enhance the accuracy, completeness, and consistency of the Firm's U.S. regulatory reporting and capital stress testing submissions, as well as implementing new requirements and guidelines as they are published.

Job Responsibilities:
- Ensure BAU activities by sourcing data, validating completeness, processing adjustments, and performing reconciliations.
- Execute the overall operating model and procedures for functional areas in the reporting space.
- Manage client relations, communications, and presentations effectively.
- Support business users of the FRI application by addressing user queries and resolving issues.
- Identify and execute process improvements to enhance the existing operating model, tools, and procedures.
- Interact with Controllers, Report Owners, and RFT (Risk & Finance Technology) partners.
- Act as an interface with Control partners, ensuring compliance with risk and controls policies.
- Escalate issues as needed to the appropriate team(s) and management.
- Partner with the projects team through full project life cycles.
- Lead programs and initiatives for reporting automation and operating model optimization.

Required Qualifications, Skills, and Capabilities:
- Bachelor's degree in Accounting, Finance, or a related discipline
- 8+ years of financial services or related experience
- Strong oral and written communication skills, with the ability to partner effectively with managers and stakeholders at all levels
- Strong working knowledge of MS Office applications (Excel, Word, PowerPoint), specifically for reconciliations and for summarizing and formatting data
- Experience using data management and visualization tools in a reporting setting: AWS Databricks, Alteryx, SQL, Tableau, Visio

Preferred Qualifications, Skills, and Capabilities:
- Familiarity with U.S. regulatory reporting (e.g., Y-9C, Call Report, CCAR), controllership functions, banking and brokerage products, and U.S. GAAP accounting principles
- Control mindset and exposure to establishing or enhancing existing controls
- Strong verbal and written communication skills, with the ability to present information at varying levels of detail depending on the audience
- Strong process and project management skills
- Enthusiastic and self-motivated; effective under pressure, with a strong work ethic and keen attention to detail and accuracy
- Aptitude and desire to learn quickly, be flexible, and think strategically
- Client- and business-focused; able to work collaboratively and build strong partnerships with clients and colleagues at all levels

Posted 2 weeks ago

Apply

8.0 - 12.0 years

12 - 18 Lacs

Noida

Work from Office

General Roles & Responsibilities:
- Technical Leadership: Demonstrate leadership and the ability to guide business and technology teams in adopting best practices and standards.
- Design & Development: Design, develop, and maintain a robust, scalable, and high-performance data estate.
- Architecture: Architect and design robust data solutions that meet business requirements, including scalability, performance, and security.
- Quality: Ensure the quality of deliverables through rigorous reviews and adherence to standards.
- Agile Methodologies: Actively participate in agile processes, including planning, stand-ups, retrospectives, and backlog refinement.
- Collaboration: Work closely with system architects, data engineers, data scientists, data analysts, cloud engineers, and other business stakeholders to determine an optimal, future-proof solution and architecture.
- Innovation: Stay updated on the latest industry trends and technologies, and drive continuous improvement initiatives within the development team.
- Documentation: Create and maintain technical documentation, including design documents and architectural user guides.

Technical Responsibilities:
- Optimize data pipelines for performance and efficiency.
- Work with Databricks clusters and configuration management tools.
- Use appropriate tools in cloud data lake development and deployment.
- Develop and implement cloud infrastructure to support current and future business needs.
- Provide technical expertise and ownership in the diagnosis and resolution of issues.
- Ensure all cloud solutions exhibit a high level of cost efficiency, performance, security, scalability, and reliability.
- Manage cloud data lake development and deployment on AWS Databricks.
- Manage and create workspaces, configure cloud resources, view usage data, and manage account identities, settings, and subscriptions in Databricks.

Required Technical Skills:
- Experience and proficiency with the Databricks platform: Delta Lake storage and Spark (PySpark, Spark SQL).
- Well versed in the Databricks Lakehouse and Unity Catalog concepts and their implementation in enterprise environments.
- Familiarity with the medallion architecture data design pattern for organizing data in a lakehouse.
- Experience and proficiency with AWS data services (S3, Glue, Athena, Redshift, etc.) and Airflow scheduling.
- Proficiency in SQL and experience with relational databases.
- Proficiency in at least one programming language (e.g., Python, Java) for data processing and scripting.
- Experience with DevOps practices: AWS DevOps for CI/CD, and Terraform/CDK for infrastructure as code.
- Good understanding of data principles and cloud data lake design and development, including data ingestion, data modeling, and data distribution.
- Jira: proficient in using Jira for managing projects and tracking progress.

Other Skills:
- Strong communication and interpersonal skills.
- Collaborate with data stewards, data owners, and IT teams for effective implementation.
- Understanding of business processes and terminology, preferably logistics.
- Experienced with Scrum and agile methodologies.

Qualifications:
- Bachelor's degree in information technology or a related field; equivalent experience may be considered.
- Overall experience of 8-12 years in data engineering.

Mandatory Competencies: Databricks; PySpark and big data; data analysis; Agile/Scrum; AWS S3 (including S3 Glacier and EBS); Redshift; AWS Glue; AWS EMR; Amazon Data Pipeline; TensorFlow on AWS; AWS Lambda, AWS EventBridge, and AWS Fargate; Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, and HDInsight; SQL database programming; Python (including Python shell); CI/CD and development tools; QA analytics; communication and collaboration.
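As background for the medallion-architecture requirement above, here is a minimal, illustrative PySpark sketch of the bronze/silver/gold pattern on Delta Lake. The paths and columns are hypothetical, and it assumes a Databricks or Delta-enabled Spark runtime:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: land raw events as-is, preserving source fidelity.
raw = spark.read.json("s3://example-lake/landing/events/")
raw.write.format("delta").mode("append").save("s3://example-lake/bronze/events")

# Silver: cleanse and conform (typed columns, deduplication, null filtering).
bronze = spark.read.format("delta").load("s3://example-lake/bronze/events")
silver = (
    bronze
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropDuplicates(["event_id"])
    .filter(F.col("event_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("s3://example-lake/silver/events")

# Gold: business-level aggregate ready for BI and reporting.
gold = silver.groupBy(F.to_date("event_ts").alias("event_date")).count()
gold.write.format("delta").mode("overwrite").save("s3://example-lake/gold/daily_event_counts")
```

Each layer is a separate Delta table, so downstream consumers can pick the refinement level they need while lineage stays traceable back to the raw bronze data.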

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Hybrid

Job Description: We are looking for an experienced Senior Data Engineer with a strong foundation in Python, SQL, and Spark, and hands-on expertise in AWS and Databricks. In this role, you will build and maintain scalable data pipelines and architecture to support analytics, data science, and business intelligence initiatives. You'll work closely with cross-functional teams to drive data reliability, quality, and performance.

Responsibilities:
- Design, develop, and optimize scalable data pipelines using Databricks on AWS, including Glue, S3, Lambda, EMR, and Databricks notebooks, workflows, and jobs.
- Build a data lake in AWS Databricks.
- Build and maintain robust ETL/ELT workflows using Python and SQL to handle structured and semi-structured data.
- Develop distributed data processing solutions using Apache Spark or PySpark.
- Partner with data scientists and analysts to provide high-quality, accessible, and well-structured data.
- Ensure data quality, governance, security, and compliance across pipelines and data stores.
- Monitor, troubleshoot, and improve the performance of data systems and pipelines.
- Participate in code reviews and help establish engineering best practices.
- Mentor junior data engineers and support their technical development.

Requirements:
- Bachelor's or master's degree in computer science, engineering, or a related field.
- 5+ years of hands-on experience in data engineering, with at least 2 years working with AWS Databricks.
- Strong programming skills in Python for data processing and automation.
- Advanced proficiency in SQL for querying and transforming large datasets.
- Deep experience with Apache Spark/PySpark in a distributed computing environment.
- Solid understanding of data modelling, warehousing, and performance optimization techniques.
- Proficiency with AWS services such as Glue, S3, Lambda, and EMR.
- Experience with version control (Git or CodeCommit).
- Experience with a workflow orchestrator such as Airflow or AWS Step Functions is a plus.
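To illustrate the ETL/ELT and semi-structured data handling this role centers on, here is a minimal PySpark sketch. The schema, paths, and column names are hypothetical, and it assumes the nested payload arrives as a JSON string column; it flattens the payload into typed columns and finishes the transform in SQL, ELT-style:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("elt-demo").getOrCreate()

# Hypothetical semi-structured input: one JSON object per line,
# with a nested payload carried as a JSON string.
payload_schema = StructType([
    StructField("customer_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = spark.read.json("s3://example-lake/landing/transactions/")

# Flatten the nested payload string into typed columns.
flat = (
    raw
    .withColumn("payload", F.from_json(F.col("payload"), payload_schema))
    .select("event_id", "event_ts", "payload.customer_id", "payload.amount")
)

# ELT style: register the cleansed data and finish the transform in SQL.
flat.createOrReplaceTempView("transactions")
daily = spark.sql("""
    SELECT customer_id, to_date(event_ts) AS txn_date, SUM(amount) AS total_amount
    FROM transactions
    GROUP BY customer_id, to_date(event_ts)
""")
daily.write.mode("overwrite").parquet("s3://example-lake/curated/daily_totals/")
```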

Posted 3 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Hybrid

Job Title: Lead Data Engineer

Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout, and maintenance of data integration initiatives. This role contributes to implementation methodologies and best practices, and works on project teams to analyse, design, develop, and deploy business intelligence / data integration solutions that support a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings, and initiatives through mentoring and coaching. It provides technical expertise in needs identification, data modelling, data movement, transformation mapping (source to target), automation, and testing strategies, translating business needs into technical solutions that adhere to established data guidelines and approaches, while leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL) to address business and environmental challenges. The role works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices, and is responsible for repeatable, lean, and maintainable enterprise BI design across organizations. It partners effectively with the client team. We expect leadership not only in the conventional sense but also within the team; candidates should exhibit qualities such as innovation, critical thinking, optimism, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.

Responsibilities:
- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc.
- Create functional and technical documentation, e.g., ETL architecture documentation, unit testing plans and results, data integration specifications, and data testing plans.
- Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models based on those needs.
- Perform data analysis to validate data models and confirm the ability to meet business needs.
- May serve as project or DI lead, overseeing multiple consultants from various competencies.
- Stay current with emerging and changing technologies to recommend and implement the most beneficial technologies and approaches for data integration.
- Ensure proper execution and creation of methodology, training, templates, resource plans, and engagement review processes.
- Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities where appropriate.
- Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business-unit level.
- Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
- Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications:
- 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc.
- 5-8 years of management experience required; 5-8 years of consulting experience preferred.
- Minimum of 5 years of data architecture, data modelling, or similar experience.
- Bachelor's degree or equivalent experience; master's degree preferred.
- Strong data warehousing, OLTP systems, data integration, and SDLC experience.
- Strong experience in orchestration, with working experience in cloud-native or third-party ETL data load orchestration (e.g., Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar).
- Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.).
- Understanding of on-premises and cloud infrastructure architectures (e.g., Azure, AWS, GCP).
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and big data.
- Strong experience with agile processes (Scrum cadences, roles, deliverables), working experience in Azure DevOps, JIRA, or similar, and experience with CI/CD using one or more code management platforms.
- Strong Databricks experience, including creating notebooks in PySpark.
- Experience using major data modelling tools (e.g., ERwin, ER/Studio, PowerDesigner).
- Experience with major database platforms (e.g., SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift).
- 3-5 years of development experience in decision support / business intelligence environments using tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.

Preferred Skills & Experience:
- Knowledge of and working experience with data integration processes, such as data warehousing, EAI, etc.
- Experience providing estimates for data integration projects, including testing, documentation, and implementation.
- Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate, and recommend alternative solutions.
- Ability to provide technical direction to other team members, including contractors and employees.
- Ability to contribute to conceptual data modelling sessions that accurately define business processes independently of data structures, and then combine the two.
- Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results.
- Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM.
- Can create documentation and presentations that stand on their own.
- Can advise sales on the evaluation of data integration efforts for new or existing client work, and can contribute to internal and external data integration proofs of concept.
- Demonstrates the ability to create new and innovative solutions to previously unencountered problems.
- Ability to work independently on projects as well as collaborate effectively across teams.
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success.
- Strong team-building, interpersonal, analytical, and problem identification and resolution skills.
- Experience working with multi-level business communities.
- Can effectively utilise SQL and/or the available BI tool to validate and elaborate business rules.
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems.
- Effectively influences, and at times oversees, business and data analysis activities to ensure sufficient understanding and quality of data.
- Demonstrates a complete understanding of, and utilises, DSC methodology documents to efficiently complete assigned roles and associated tasks.
- Deals effectively with all team members and builds strong working relationships and rapport with them.
- Understands and leverages a multi-layer semantic model to ensure the scalability, durability, and supportability of the analytic solution.
- Understands modern data warehouse concepts (real-time, cloud, big data) and how to enable such capabilities from a reporting and analytics standpoint.

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

What impact will you make? Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration, and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team: Deloitte's AI&D practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning.

Work you'll do
Location: Bangalore/Mumbai/Pune/Delhi/Chennai/Hyderabad/Kolkata
Role: Databricks Data Engineering Senior Consultant

We are seeking highly skilled Databricks Data Engineers to join our data modernization team. You will play a pivotal role in designing, developing, and maintaining robust data solutions on the Databricks platform. Your experience in data engineering, along with a deep understanding of Databricks, will be instrumental in building solutions that drive data-driven decision-making across a variety of customers.

Mandatory Skills: Databricks, Spark, Python / SQL

Responsibilities:
• Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load data from various sources into the data lake.
• Build and maintain scalable and efficient data processing workflows using Spark (PySpark or Spark SQL), following coding standards and best practices.
• Collaborate with technical and business stakeholders to understand data requirements and translate them into technical solutions.
• Develop data models and schemas to support reporting and analytics needs.
• Ensure data quality, integrity, and security by implementing appropriate checks and controls.
• Monitor and optimize data processing performance, identifying and resolving bottlenecks.
• Stay up to date with the latest advancements in data engineering and Databricks technologies.

Qualifications:
• Bachelor's or master's degree in any field
• 6-10 years of experience in designing, implementing, and maintaining data solutions on Databricks
• Experience with at least one of the popular cloud platforms: Azure, AWS, or GCP
• Experience with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes
• Knowledge of data warehousing and data modelling concepts
• Experience with Python or SQL
• Experience with Delta Lake
• Understanding of DevOps principles and practices
• Excellent problem-solving and troubleshooting skills
• Strong communication and teamwork skills

Your role as a leader: At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify the issues that are most important for our clients, our people, and society, and to make an impact that matters.

In addition to living our purpose, Senior Consultants across our organization:
- Develop high-performing people and teams through challenging and meaningful opportunities
- Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders
- Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people
- Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction
- Act as a role model, embracing and living our purpose and values, and recognizing others for the impact they make

How you will grow: At Deloitte, our professional development plan focuses on helping people at every level of their career identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose: Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips: We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals.

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 27 Lacs

Bengaluru

Hybrid

Labcorp is hiring a Senior Data Engineer. This person will be an integrated member of the Labcorp Data and Analytics team, working within the IT team, and will play a crucial role in designing, developing, and maintaining data solutions using Databricks, Fabric, Spark, PySpark, and Python. The role is responsible for reviewing business requests and translating them into technical solutions and technical specifications. In addition, the engineer will mentor fellow developers to grow their knowledge and expertise, working in a fast-paced, high-volume processing environment where quality and attention to detail are vital.

Responsibilities:
- Design and implement end-to-end data engineering solutions by leveraging the full suite of Databricks and Fabric tools, including data ingestion, transformation, and modeling.
- Design, develop, and maintain end-to-end data pipelines using Spark, ensuring scalable, reliable, and cost-optimized solutions.
- Conduct performance tuning and troubleshooting to identify and resolve issues.
- Implement data governance and security best practices, including role-based access control, encryption, and auditing.
- Perform effectively in a fast-paced, agile development environment.

Requirements:
- 8+ years of experience in designing and implementing data solutions, with at least 4+ years in data engineering.
- Extensive experience with Databricks and Fabric, including a deep understanding of their architecture, data modeling, and real-time analytics.
- Minimum 6+ years of experience in Spark, PySpark, and Python.
- Strong experience in SQL, Spark SQL, data modeling, and RDBMS concepts.
- Strong knowledge of Data Fabric services, particularly Data Engineering, Data Warehouse, Data Factory, and Real-Time Intelligence.
- Strong problem-solving skills, with the ability to multi-task.
- Familiarity with security best practices in cloud environments, Active Directory, encryption, and data privacy compliance.
- Effective oral and written communication.
- Experience with agile development, Scrum, and Application Lifecycle Management (ALM).
- Preference given to current or former Labcorp employees.

Education: Bachelor's in Engineering or MCA.

Posted 1 month ago

Apply

6.0 - 11.0 years

11 - 21 Lacs

Kolkata, Pune, Chennai

Work from Office

Role & responsibilities: Data Engineer with expertise in AWS, Databricks, and PySpark.

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 18 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

Role & responsibilities: Data Engineer with expertise in AWS, Databricks, and PySpark.

Posted 1 month ago

Apply

5.0 - 7.0 years

15 - 22 Lacs

Chennai

Work from Office

Role & responsibilities:

Job Description: Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS. Must have AWS Databricks; good to have PySpark, Snowflake, and Talend.

Requirements:
• Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc.
• Should be very proficient in large-scale data operations using Databricks and overall very comfortable using Python.
• Familiarity with AWS compute, storage, and IAM concepts.
• Experience working with an S3 data lake as the storage tier.
• Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
• Cloud warehouse experience (Snowflake, etc.) is a huge plus.
• Carefully evaluates alternative risks and solutions before taking action; optimizes the use of all available resources.
• Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.

Skills:
• Hands-on experience with Databricks, Spark SQL, and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
• Experience with shell scripting.
• Exceptionally strong analytical and problem-solving skills.
• Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
• Strong experience with relational databases and data access methods, especially SQL.
• Excellent collaboration and cross-functional leadership skills.
• Excellent communication skills, both written and verbal.
• Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
• Ability to leverage data assets to respond to complex questions that require timely answers.
• Working knowledge of migrating relational and dimensional databases to the AWS cloud platform.

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.

Note: Only immediate joiners or candidates serving their notice period need apply. Interested candidates can apply.

Regards,
HR Manager
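As a flavor of the Databricks Spark SQL work over an S3 data lake that this posting describes, here is a minimal, illustrative sketch; the paths, table, and column names are hypothetical:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-lake-sql").getOrCreate()

# Register files sitting in the S3 data lake as a queryable temp view.
spark.read.parquet("s3://example-lake/curated/sales/").createOrReplaceTempView("sales")

# Typical Spark SQL workload: a rolling 30-day rollup straight off the lake.
top_regions = spark.sql("""
    SELECT region, SUM(net_amount) AS revenue
    FROM sales
    WHERE sale_date >= date_sub(current_date(), 30)
    GROUP BY region
    ORDER BY revenue DESC
    LIMIT 10
""")
top_regions.show()
```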

Posted 1 month ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Chennai

Hybrid

Technical Skills:
- Strong proficiency in SQL for data manipulation and querying.
- Knowledge of Python scripting for data processing and automation.
- Experience with Reltio Integration Hub (RIH) and handling API-based integrations.
- Familiarity with data modelling, matching, and survivorship concepts and methodologies.
- Experience with D&B, ZoomInfo, and Salesforce connectors for data enrichment.
- Understanding of MDM workflow configurations and role-based data governance.
- Experience with AWS Databricks, data lakes, and data warehouses.

Key Responsibilities:
- Implement and configure MDM solutions using Reltio, ensuring alignment with business requirements and best practices.
- Develop and maintain data models, workflows, and business rules within the MDM platform.
- Work on Reltio Workflow (DCR workflow and custom workflow) to manage data approvals and role-based assignments.
- Support data integration efforts using Reltio Integration Hub (RIH) to facilitate data movement across multiple systems.
- Develop ETL pipelines using SQL, Python, and integration tools to extract, transform, and load data.
- Work with D&B, ZoomInfo, and Salesforce connectors for data enrichment and integration.
- Perform data analysis and profiling to identify data quality issues and recommend solutions for data cleansing and enrichment.
- Collaborate with stakeholders to define and document data governance policies, procedures, and standards.
- Optimize MDM workflows to enhance data stewardship and governance.
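As an illustration of the Python/SQL ETL work described above, here is a minimal extract-transform-load sketch. The REST endpoint, response shape, and SQLite target are hypothetical placeholders, not Reltio's actual API or a production warehouse:

```python
import requests
import sqlite3

# Hypothetical source API; a real integration would target the MDM
# platform's documented endpoints with proper authentication.
SOURCE_URL = "https://api.example.com/v1/accounts"

def extract() -> list[dict]:
    # Pull raw records from the (hypothetical) upstream API.
    resp = requests.get(SOURCE_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()["records"]  # assumed response shape

def transform(records: list[dict]) -> list[tuple]:
    # Basic cleansing: trim and title-case names, drop rows missing an ID.
    return [
        (r["id"], r["name"].strip().title(), r.get("country", "UNKNOWN"))
        for r in records
        if r.get("id")
    ]

def load(rows: list[tuple]) -> None:
    # Load into a local SQLite table as a stand-in for the target store.
    con = sqlite3.connect("mdm_demo.db")
    con.execute(
        "CREATE TABLE IF NOT EXISTS accounts (id TEXT PRIMARY KEY, name TEXT, country TEXT)"
    )
    con.executemany("INSERT OR REPLACE INTO accounts VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract()))
```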

Posted 1 month ago

Apply

8.0 - 12.0 years

15 - 27 Lacs

Mumbai, Pune, Bengaluru

Work from Office

Role & responsibilities:

Job Description: Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS. Must have AWS Databricks; good to have PySpark, Snowflake, and Talend.

Requirements:
• Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc.
• Should be very proficient in large-scale data operations using Databricks and overall very comfortable using Python.
• Familiarity with AWS compute, storage, and IAM concepts.
• Experience working with an S3 data lake as the storage tier.
• Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
• Cloud warehouse experience (Snowflake, etc.) is a huge plus.
• Carefully evaluates alternative risks and solutions before taking action; optimizes the use of all available resources.
• Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.

Skills:
• Hands-on experience with Databricks, Spark SQL, and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
• Experience with shell scripting.
• Exceptionally strong analytical and problem-solving skills.
• Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
• Strong experience with relational databases and data access methods, especially SQL.
• Excellent collaboration and cross-functional leadership skills.
• Excellent communication skills, both written and verbal.
• Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
• Ability to leverage data assets to respond to complex questions that require timely answers.
• Working knowledge of migrating relational and dimensional databases to the AWS cloud platform.

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.

Note: Only immediate joiners or candidates serving their notice period need apply. Interested candidates can apply.

Regards,
HR Manager

Posted 1 month ago

Apply

7.0 - 8.0 years

7 - 9 Lacs

Bengaluru

Work from Office

We are seeking an experienced Data Engineer to join our innovative data team and help build the scalable data infrastructure, software consultancy, and development services that power business intelligence, analytics, and machine learning initiatives. The ideal candidate will design, develop, and maintain robust, high-performance data pipelines and solutions while ensuring data quality, reliability, and accessibility across the organization, working with technologies like Python, Microsoft Fabric, Snowflake, Dataiku, SQL Server, Oracle, and PostgreSQL.

Required Qualifications:
- 5+ years of experience in a data engineering role.
- Programming languages: proficiency in Python.
- Cloud platforms: hands-on experience with Azure (Fabric, Synapse, Data Factory, Event Hubs).
- Databases: strong SQL skills and experience with both relational (Microsoft SQL Server, PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases.
- Version control: proficiency with Git and collaborative development workflows.
- Proven track record of building production-grade data pipelines handling large-scale data.

Desired Qualifications:
- Experience with containerization (Docker) and orchestration (Kubernetes) technologies.
- Knowledge of machine learning workflows and MLOps practices.
- Familiarity with data visualization tools (Tableau, Looker, Power BI).
- Experience with stream processing and real-time analytics.
- Experience with data governance and compliance frameworks (GDPR, CCPA).
- Contributions to open-source data engineering projects.
- Relevant cloud certifications (e.g., Microsoft Certified: Azure Data Engineer Associate, AWS Certified Data Engineer, Google Cloud Professional Data Engineer).
- Specific experience or certifications in Microsoft Fabric, Dataiku, or Snowflake.

Posted 2 months ago

Apply

11 - 20 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 11-20 years
Location: Pan India

Job Description: Minimum 2 years of hands-on experience as a Solution Architect (AWS Databricks).

If interested, please forward your updated resume to sankarspstaffings@gmail.com

With Regards,
Sankar G
Sr. Executive - IT Recruitment

Posted 2 months ago

Apply

12 - 18 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities:

Job Description: Cloud Data/Information Architect. Core skill set: implementing cloud data pipelines. Tools: AWS Databricks, Snowflake, Python, Fivetran.

Requirements:
- Candidate must be experienced working in projects involving AWS Databricks, Python, AWS-native data architecture, and services like S3, Lambda, Glue, EMR, and Databricks Spark.
- Experience handling the AWS cloud platform.

Responsibilities:
- Identify and define foundational business data domains and data domain elements.
- Identify and collaborate with data product owners and stewards in business circles to capture data definitions.
- Drive data source/lineage reporting and the identification of reference data needs.
- Recommend data extraction and replication patterns.
- Experience with data migration from big data platforms to AWS Cloud on S3, Snowflake, and Redshift.
- Understands where to obtain the information needed to make appropriate decisions.
- Demonstrates the ability to break a problem down into manageable pieces and implement effective, timely solutions; identifies the problem versus the symptom.
- Manages problems that require the involvement of others to solve; reaches sound decisions quickly.
- Carefully evaluates alternative risks and solutions before taking action; optimizes the use of all available resources.
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.

Skills:
- Hands-on experience with AWS Databricks, especially S3, Snowflake, and Python.
- Experience with shell scripting.
- Exceptionally strong analytical and problem-solving skills.
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Excellent collaboration and cross-functional leadership skills.
- Excellent communication skills, both written and verbal.
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
- Ability to leverage data assets to respond to complex questions that require timely answers.
- Working knowledge of migrating relational and dimensional databases to the AWS cloud platform.

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
