
310 Data Lake Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

12.0 - 15.0 years

5 - 5 Lacs

Thiruvananthapuram

Work from Office

Source: Naukri

Senior Data Architect - Big Data & Cloud Solutions

Experience: 10+ years
Industry: Information Technology / Data Engineering / Cloud Computing

Job Summary: We are seeking a highly experienced and visionary Data Architect to lead the design and implementation of scalable, high-performance data solutions. The ideal candidate will have deep expertise in Apache Kafka, Apache Spark, AWS Glue, PySpark, and cloud-native architectures, with a strong background in solution architecture and enterprise data strategy.

Key Responsibilities:
- Design and implement end-to-end data architecture solutions on AWS using Glue, S3, Redshift, and other services.
- Architect and optimize real-time data pipelines using Apache Kafka and Spark Streaming (an illustrative sketch follows this listing).
- Lead the development of ETL/ELT workflows using PySpark and AWS Glue.
- Collaborate with stakeholders to define data strategies, governance, and best practices.
- Ensure data quality, security, and compliance across all data platforms.
- Provide technical leadership and mentorship to data engineers and developers.
- Evaluate and recommend new tools and technologies to improve data infrastructure.
- Translate business requirements into scalable and maintainable data solutions.

Required Skills & Qualifications:
- 10+ years of experience in data engineering, architecture, or related roles.
- Strong hands-on experience with: Apache Kafka (event streaming, topic design, schema registry); Apache Spark (batch and streaming); AWS Glue, S3, Redshift, Lambda, CloudFormation/Terraform; PySpark for large-scale data processing.
- Proven experience in solution architecture and designing cloud-native data platforms.
- Deep understanding of data modeling, data lakes, and data warehousing concepts.
- Strong programming skills in Python and SQL.
- Experience with CI/CD pipelines and DevOps practices for data workflows.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- AWS Certified Solutions Architect or Big Data Specialty certification.
- Experience with data governance tools and frameworks.
- Familiarity with containerization (Docker, Kubernetes) and orchestration tools (Airflow, Step Functions).
- Exposure to machine learning pipelines and MLOps is a plus.

Required Skills: Apache, PySpark, AWS Cloud, Kafka
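For candidates gauging the hands-on bar here, a minimal sketch of the Kafka-to-S3 Structured Streaming pattern this role describes. The broker address, topic, schema, and bucket paths are hypothetical placeholders, and the job assumes the spark-sql-kafka connector package is on the cluster; this is an illustration, not the employer's actual pipeline.

```python
# Minimal PySpark Structured Streaming sketch: read a Kafka topic, parse JSON,
# land it in S3 as Parquet. All names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-to-s3").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "payments")                    # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/payments/")             # placeholder
    .option("checkpointLocation", "s3a://example-bucket/_chk/payments/")
    .start()
)
query.awaitTermination()
```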

Posted 4 days ago

Apply

8.0 - 10.0 years

3 - 6 Lacs

Chennai

Work from Office

Source: Naukri

About the Role: Senior Business Intelligence Analyst

The Business Intelligence Analyst is responsible for collecting and analyzing data from multiple source systems to help the organization make better business decisions. This role is crucial in maintaining data quality, compliance, and accessibility while driving data-driven decision-making and reporting for Mindsprint clients. The role requires a combination of OLAM business domain expertise, problem-solving skills, and business acumen.

Responsibilities:
- Create, review, validate, and manage data as it is collected, acting as custodian of the data being generated.
- Develop policies and procedures for the collection and analysis of data.
- Apply analytical skills to derive meaningful insights from data and generate predictive, insightful reports.
- Build daily reports and schedule internal weekly and monthly meetings, preparing in advance to share relevant and beneficial information.
- Data Ownership: Assume ownership of specific datasets, data dictionaries, metadata, and master data, and ensure data accuracy, completeness, and relevance.
- Data Integration: Collaborate with system owners, data engineers, domain experts, and integration teams to facilitate smooth integration of financial data from multiple systems/entities into the financial transactional and analytical data marts.
- Data Quality Assurance: Establish and enforce data quality standards and policies within the financial domain. Collaborate with data engineers, analysts, data stewards, and data custodians to monitor and improve data quality.
- Data Access Control: Control and manage access to data, ensuring appropriate permissions and security measures are in place. Monitor and audit data access to prevent unauthorized use.
- Data Reporting and Analysis: Collaborate with finance teams to generate accurate and timely financial reports. Perform data analysis to identify trends, anomalies, and insights in financial data, supporting financial modelling, forecasting, and predictive decision-making.
- Collaborate with co-workers and management to implement improvements.

Job Qualifications:
- Master's or Bachelor's degree in finance and accounting or related fields; an advanced degree is a plus.
- Proven experience in financial data management, data governance, and data analysis.
- Demonstrated ability to approach complex problems with analytical and critical thinking skills.
- Excellent written and verbal communication skills.
- Leadership skills and the ability to collaborate effectively with cross-functional teams.
- Ability to influence and interact with senior management.

Preferred Qualifications & Skills:
- Knowledge of Big Data, Data Lake, Azure Data Factory (ADF), Snowflake, Databricks, Synapse, Monte Carlo, Atlan, and DevOps tools like dbt.
- Agile project management skills with knowledge of JIRA and Confluence.
- Good understanding of financial concepts such as Balance Sheet, P&L, TB, direct costs management, fair value, book value, production/standard costs, stock valuations, ratios, and sustainability finance.
- Experience working with ERP data, especially SAP FI and SAP CO.
- Strategic mindset and the ability to identify opportunities to use data to drive business growth; you should be able to think creatively and identify innovative solutions to complex problems.

Posted 4 days ago

Apply

4.0 - 9.0 years

11 - 20 Lacs

Pune

Work from Office

Source: Naukri

Role: Sr. Database Engineer
Location: Pune
Experience: 5-7 years

Job Description: Oracle PL/SQL Developer. The associate shall perform the role of PL/SQL Developer and shall be responsible for the following:
• Hands-on coding in SQL and PL/SQL.
• Create/implement database architecture for new applications and enhancements to existing applications.
• Hands-on experience in data modeling, SSAS, cubes, and query optimization.
• Create/implement strategies for partitioning, archiving, and maturity models for applications.
• Review queries created by other developers for adherence to standards and performance issues (a sketch of pulling an execution plan follows this listing).
• PL/SQL, T-SQL, SQL query optimization, data models, data lakes.
• Interact with database and application analysts and business users for estimations.
• Do impact analysis of existing applications and suggest the best ways of incorporating new requirements.
• Proactively engage in the remediation of software issues related to code quality, security, and/or patterns/frameworks.

Interested candidates can share their resume at Neesha1@damcogroup.com
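As a rough illustration of the performance-review duty above, a minimal sketch that captures an Oracle execution plan from Python using the python-oracledb driver. The connection details, table, and partition name are hypothetical placeholders; in practice this review is often done directly in SQL*Plus or SQL Developer.

```python
# Sketch: capture and print an execution plan for a query under review.
# All identifiers below are placeholders.
import oracledb

conn = oracledb.connect(user="app_user", password="***",
                        dsn="dbhost.example.com/orclpdb")  # placeholder DSN
cur = conn.cursor()

cur.execute("""
    EXPLAIN PLAN FOR
    SELECT customer_id, SUM(amount)
    FROM sales PARTITION (sales_2024)   -- placeholder partition
    GROUP BY customer_id
""")

# DBMS_XPLAN renders the plan just captured into readable rows.
for (line,) in cur.execute(
        "SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
    print(line)

conn.close()
```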

Posted 5 days ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Power BI Developer

Must have: Power BI, data integration and visualization, data lakes, and dashboard development.
Good to have: Tableau and the ELK stack (Elasticsearch, Logstash, Kibana) to retrieve, analyze, and visualize log or event data.
Experience: 7+ years.

Job Description:
- Develop and Manage Power BI Solutions: Design, develop, and maintain Power BI reports and dashboards, built from scratch, that generate comprehensive reports and insights to support business decision-making (an automation sketch follows this listing).
- Data Lake Integration: Integrate data from various sources into a data lake, ensuring data is clean, accurate, and accessible; troubleshoot and resolve issues related to data integration and dashboard functionality.
- Data Modeling: Create and maintain data models to support reporting and analytics needs.
- ETL Processes: Design and implement ETL (Extract, Transform, Load) processes to move data from source systems to the data lake and Power BI.
- Performance Optimization: Optimize Power BI reports and dashboards for performance and usability.
- Documentation: Document data models, ETL processes, and Power BI solutions to ensure maintainability and knowledge sharing.

Operational responsibilities:
- Oversee and support the process by reviewing daily transactions on performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries, and resolve client queries as per the SLAs defined in the contract.
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot the most frequent trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries; manage and resolve technical roadblocks/escalations as per SLA and quality requirements, and escalate unresolved issues to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner, offering alternative solutions where appropriate with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge; collate and conduct trainings (triages) to bridge skill gaps identified through interviews with Production Specialists, as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates, and enroll in product-specific and other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Performance parameters:
1. Process: number of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT.
2. Team management: productivity, efficiency, absenteeism.
3. Capability development: triages completed, technical test performance.

Mandatory Skills: Power BI visualization on cloud. Experience: 5-8 years.
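As a small illustration of the dashboard-management side of this role, a sketch that queues a Power BI dataset refresh through the public REST API. The workspace and dataset IDs are placeholders, and the Azure AD token acquisition (for example via msal) is assumed rather than shown.

```python
# Sketch: trigger a Power BI dataset refresh via the REST API.
# Token and IDs below are hypothetical placeholders.
import requests

token = "<AAD access token>"                          # e.g. obtained via msal
group_id = "00000000-0000-0000-0000-000000000000"     # placeholder workspace
dataset_id = "11111111-1111-1111-1111-111111111111"   # placeholder dataset

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
    f"/datasets/{dataset_id}/refreshes",
    headers={"Authorization": f"Bearer {token}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()   # HTTP 202 means the refresh was queued
```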

Posted 5 days ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

Key Responsibilities:
- Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions.
- Build and operate very large data warehouses or data lakes.
- ETL optimization, designing, coding, and tuning big data processes using Apache Spark (see the sketch after this listing).
- Build data pipelines and applications to stream and process datasets at low latencies.
- Show efficiency in handling data: tracking data lineage, ensuring data quality, and improving discoverability of data.

Technical Experience:
- Minimum of 5 years of experience in Databricks engineering solutions on AWS Cloud platforms using PySpark, Databricks SQL, and data pipelines using Delta Lake.
- Minimum of 5 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture and delivery.

Email: maya@mounttalent.com
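A minimal sketch of the raw-to-Delta batch step typical of this kind of Databricks-on-AWS work. Bucket paths and column names are hypothetical placeholders; a real pipeline would add schema enforcement and incremental loading.

```python
# Sketch: read raw JSON from S3, deduplicate, write a partitioned Delta table.
# Paths and columns below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("raw-to-delta").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/orders/")   # placeholder path

cleaned = raw.dropDuplicates(["order_id"]).filter("amount IS NOT NULL")

(cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-bucket/delta/orders/"))            # placeholder path
```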

Posted 1 week ago

Apply

5.0 - 10.0 years

0 - 0 Lacs

Pune, Chennai

Hybrid

Source: Naukri

Ciklum is looking for a Senior Microsoft Fabric Data Engineer to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

About the role: We are seeking a highly skilled and experienced Senior Microsoft Fabric Data Engineer to design, develop, and optimize advanced data solutions leveraging the Microsoft Fabric platform. You will be responsible for building robust, scalable data pipelines, integrating diverse and large-scale data sources, and enabling sophisticated analytics and business intelligence capabilities. This role requires extensive hands-on expertise with Microsoft Fabric, a deep understanding of Azure data services, and mastery of modern data engineering practices.

Responsibilities:
- Lead the design and implementation of highly scalable and efficient data pipelines and data warehouses using Microsoft Fabric and a comprehensive suite of Azure services (Data Factory, Synapse Analytics, Azure SQL, Data Lake)
- Develop, optimize, and oversee complex ETL/ELT processes for data ingestion, transformation, and loading from a multitude of disparate sources, ensuring high performance with large-scale datasets (an illustrative notebook sketch follows this listing)
- Ensure the highest level of data integrity, quality, and governance throughout the entire Fabric environment, establishing best practices for data management
- Collaborate extensively with stakeholders, translating intricate business requirements into actionable, resilient, and optimized data solutions
- Proactively troubleshoot, monitor, and fine-tune data pipelines and workflows for peak performance and efficiency, particularly in handling massive datasets
- Architect and manage workspace architecture, implement robust user access controls, and enforce data security in strict compliance with privacy regulations
- Automate platform tasks and infrastructure management using advanced scripting languages (Python, PowerShell) and Infrastructure as Code (Terraform, Ansible) principles
- Document comprehensive technical solutions, enforce code modularity, and champion best practices in version control and documentation across the team
- Stay at the forefront of Microsoft Fabric updates and new features, and contribute significantly to continuous improvement initiatives and the adoption of cutting-edge technologies

Requirements:
- Minimum of 5 years of progressive experience in data engineering, with at least 3 years of hands-on, in-depth work on Microsoft Fabric and a wide array of Azure data services
- Exceptional proficiency in SQL, Python, and advanced data transformation tools (e.g., Spark, PySpark notebooks)
- Mastery of data warehousing concepts, dimensional modeling, and advanced ETL best practices
- Extensive experience with complex hybrid cloud and on-premises data integration scenarios
- Profound understanding of data governance, security protocols, and compliance standards
- Excellent problem-solving, analytical, and communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical audiences

Desirable:
- Experience with Power BI, Azure Active Directory, and managing very large-scale data infrastructure
- Strong familiarity with Infrastructure as Code and advanced automation tools
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent extensive experience)

What's in it for you?
- Care: your mental and physical health is our priority. We ensure comprehensive company-paid medical insurance, as well as financial and legal consultation
- Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy licence, language courses and company-paid certifications
- Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally
- Flexibility: hybrid work mode at Chennai or Pune
- Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team helps change a project if needed to help you grow, excel professionally and fulfil your potential
- Global impact: work on large-scale projects that redefine industries with international and fast-growing clients
- Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events

About us: At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you'll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress. India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level. Want to learn more about us? Follow us on Instagram, Facebook, LinkedIn. Explore, empower, engineer with Ciklum! Experiences of tomorrow. Engineered together.

Interested already? We would love to get to know you! Submit your application. Can't wait to see you at Ciklum.
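For a flavor of the day-to-day, a minimal sketch of a Fabric PySpark notebook cell that ingests a CSV from a Lakehouse Files area into a managed Delta table. The file path and table name are hypothetical, and it assumes the notebook is attached to a lakehouse, where a `spark` session is provided automatically.

```python
# Sketch (Fabric notebook cell): land a CSV from the Lakehouse Files area
# into a bronze Delta table. Path and table name are placeholders.
from pyspark.sql.functions import to_date, col

orders = (
    spark.read.option("header", True)
    .csv("Files/landing/orders.csv")            # placeholder Lakehouse path
    .withColumn("order_date", to_date(col("order_date")))
)

# Managed tables in a Fabric Lakehouse are stored in Delta format.
orders.write.mode("overwrite").saveAsTable("orders_bronze")
```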

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Chennai

Work from Office

Source: Naukri

Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves.

The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles, data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures.

This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States.

Responsibilities:
- Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage).
- Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security (see the sketch after this listing).
- Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g., DOMO, Looker, Databricks).
- Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards.
- Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability.
- Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.

Requirements:
- Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g., Mathematics, Statistics, Engineering).
- 3+ years of experience using GCP Data Lake and Storage Services. Certifications in GCP are preferred (e.g., Professional Cloud Developer, Professional Cloud Database Engineer).
- Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows.
- Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion.

Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal. Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.
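As a concrete example of one data-lake building block named above, a sketch that loads Parquet files from Cloud Storage into BigQuery with the official Python client. The bucket, project, dataset, and table names are hypothetical placeholders.

```python
# Sketch: batch-load Parquet from GCS into a BigQuery table.
# All resource names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job = client.load_table_from_uri(
    "gs://example-bucket/landing/events/*.parquet",      # placeholder URI
    "example_project.analytics.events",                  # placeholder table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    ),
)
job.result()   # block until the load job finishes
print(f"Loaded {job.output_rows} rows")
```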

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Chennai

Work from Office

Source: Naukri

Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves.

The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles, data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures.

This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States.

Responsibilities:
- Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage).
- Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security.
- Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g., DOMO, Looker, Databricks).
- Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards.
- Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability.
- Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.

Requirements:
- Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g., Mathematics, Statistics, Engineering).
- 4+ years of experience using GCP Data Lake and Storage Services. Certifications in GCP are preferred (e.g., Professional Cloud Developer, Professional Cloud Database Engineer).
- Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows.
- Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion.

Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal. Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Source: Naukri

Job Title: Automation Engineer - Databricks
Job Type: Full-time, Contractor
Location: Hybrid - Hyderabad | Pune | Delhi

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: We are seeking a detail-oriented and innovative Automation Engineer - Databricks to join our customer's team. In this critical role, you will design, develop, and execute automated tests to ensure the quality, reliability, and integrity of data within Databricks environments. If you are passionate about data quality, thrive in collaborative environments, and excel at both written and verbal communication, we'd love to meet you.

Key Responsibilities:
- Design, develop, and maintain robust automated test scripts using Python, Selenium, and SQL to validate data integrity within Databricks environments (a test sketch follows this listing).
- Execute comprehensive data validation and verification activities to ensure accuracy and consistency across multiple systems, data warehouses, and data lakes.
- Create detailed and effective test plans and test cases based on technical requirements and business specifications.
- Integrate automated tests with CI/CD pipelines to facilitate seamless and efficient testing and deployment processes.
- Work collaboratively with data engineers, developers, and other stakeholders to gather data requirements and achieve comprehensive test coverage.
- Document test cases, results, and identified defects; communicate findings clearly to the team.
- Conduct performance testing to ensure data processing and retrieval meet established benchmarks.
- Provide mentorship and guidance to junior team members, promoting best practices in test automation and data validation.

Required Skills and Qualifications:
- Strong proficiency in Python, Selenium, and SQL for developing test automation solutions.
- Hands-on experience with Databricks, data warehouse, and data lake architectures.
- Proven expertise in automated testing of data pipelines, preferably with tools such as Apache Airflow, dbt test, or similar.
- Proficient in integrating automated tests within CI/CD pipelines on cloud platforms (AWS, Azure preferred).
- Excellent written and verbal communication skills with the ability to translate technical concepts to diverse audiences.
- Bachelor's degree in Computer Science, Information Technology, or a related discipline.
- Demonstrated problem-solving skills and a collaborative approach to teamwork.

Preferred Qualifications:
- Experience with implementing security and data protection measures in data-driven applications.
- Ability to integrate user-facing elements with server-side logic for seamless data experiences.
- Demonstrated passion for continuous improvement in test automation processes, tools, and methodologies.
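A minimal sketch of the kind of automated data-quality check this role describes, written as a pytest test against a Databricks SQL warehouse via the databricks-sql-connector package. The environment-variable names and the table are hypothetical placeholders.

```python
# Sketch: pytest data-validation check against Databricks SQL.
# Env var names and the table below are placeholders.
import os

import pytest
from databricks import sql   # pip install databricks-sql-connector


@pytest.fixture(scope="module")
def conn():
    with sql.connect(
        server_hostname=os.environ["DBSQL_HOST"],
        http_path=os.environ["DBSQL_HTTP_PATH"],
        access_token=os.environ["DBSQL_TOKEN"],
    ) as c:
        yield c


def test_no_duplicate_order_ids(conn):
    with conn.cursor() as cur:
        cur.execute("""
            SELECT order_id, COUNT(*) AS n
            FROM analytics.orders          -- placeholder table
            GROUP BY order_id
            HAVING COUNT(*) > 1
        """)
        dupes = cur.fetchall()
    assert not dupes, f"duplicate order_ids found: {dupes[:5]}"
```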

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Source: Naukri

About Us: We are one of India's most exciting and fast-growing mobile gaming companies. Founded in 2014, and creating a global mobile gaming landscape in partnership with Modern Times Group (MTG), our vision is to create simple, impactful casual game experiences at a massive scale. Since our inception, we have built a worldwide network of chart-topping games and a powerful tech and analytics infrastructure to turbocharge their growth. Our product portfolio consists of evergreen hits like Daily Themed Crossword, WordTrip, WordJam, WordWars, WordTrek, TileMatch and Jigsaw. Visit us at www.playsimple.in to know more.

Role: Senior Data Scientist / Lead Data Scientist

Position Summary: We're looking for Data Scientists to join our Central Analytics team. It's a fast-paced, high-adrenaline job, with plenty to learn. If you enjoy crunching numbers, are never satisfied with the products around you, and have always wanted to make things better than the best, you'll love this job.

What's required of you:
- Work closely with product leaders as a data-driven advisor and partner on strategic issues.
- Work collaboratively with game teams to deliver actionable insights into our games to further increase user acquisition, engagement and monetization.
- Proactively perform a wide range of analyses to identify trends, issues, and opportunities across games to help us continue to improve gameplay.
- Answer business-related questions through exploratory data analyses and ad-hoc reporting.

Requirements:
- Tech/M.
- Hands-on experience in Python/Spark for data crunching, data visualization and machine learning.
- SQL skills: experience querying large, complex datasets from a data lake/warehouse.
- Ability to execute research projects and generate practical results and recommendations.
- The candidate should enjoy both working with people and undertaking rigorous statistical analyses.
- Knowledge of deep learning is highly desirable.

Posted 1 week ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Flexing It is a freelance consulting marketplace that connects freelancers and independent consultants with organisations seeking independent talent. Flexing It has partnered with our client, a global leader in energy management and automation, which is seeking a Data Engineer to prepare data and make it available in an efficient and optimized format for its different data consumers, ranging from BI and analytics to data science applications. The role requires working with current technologies, in particular Apache Spark, Lambda and Step Functions, Glue Data Catalog, and Redshift on the AWS environment.

Key Responsibilities:
- Design and develop new data ingestion patterns into the IntelDS Raw and/or Unified data layers, based on the requirements and needs for connecting new data sources or building new data objects. Working with ingestion patterns allows the data pipelines to be automated.
- Participate in and apply DevSecOps practices by automating the integration and delivery of data pipelines in a cloud environment. This can include the design and implementation of end-to-end data integration tests and/or CI/CD pipelines.
- Analyze existing data models, and identify and implement performance optimizations for data ingestion and data consumption, to accelerate data availability within the platform and to consumer applications.
- Support client applications in connecting to and consuming data from the platform, and ensure they follow our guidelines and best practices.
- Participate in the monitoring of the platform and the debugging of detected issues and bugs.

Skills required:
- Minimum of 3 years' prior experience as a data engineer, with proven experience on Big Data and data lakes in a cloud environment.
- Bachelor's or Master's degree in computer science or applied mathematics (or equivalent).
- Proven experience working with data pipelines/ETL/BI, regardless of the technology.
- Proven experience working with AWS, including at least three of: Redshift, S3, EMR, CloudFormation, DynamoDB, RDS, Lambda.
- Big Data technologies and distributed systems: one of Spark, Presto or Hive.
- Python language: scripting and object oriented.
- Fluency in SQL for data warehousing (Redshift in particular is a plus).
- Good understanding of data warehousing and data modelling concepts.
- Familiarity with Git, Linux, and CI/CD pipelines is a plus.
- Strong systems/process orientation with demonstrated analytical thinking, organization skills and problem-solving skills.
- Ability to self-manage, prioritize and execute tasks in a demanding environment.
- Strong consultancy orientation and experience, with the ability to form collaborative, productive working relationships across diverse teams and cultures.
- Willingness and ability to train and teach others.
- Ability to facilitate meetings and follow up on resulting action items.

Posted 1 week ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Title: Automation Engineer
Job Type: Full-time, Contractor

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: We are seeking a detail-oriented and innovative Automation Engineer to join our customer's team. In this critical role, you will design, develop, and execute automated tests to ensure the quality, reliability, and integrity of data within Databricks environments. If you are passionate about data quality, thrive in collaborative environments, and excel at both written and verbal communication, we'd love to meet you.

Key Responsibilities:
- Design, develop, and maintain robust automated test scripts using Python, Selenium, and SQL to validate data integrity within Databricks environments.
- Execute comprehensive data validation and verification activities to ensure accuracy and consistency across multiple systems, data warehouses, and data lakes.
- Create detailed and effective test plans and test cases based on technical requirements and business specifications.
- Integrate automated tests with CI/CD pipelines to facilitate seamless and efficient testing and deployment processes.
- Work collaboratively with data engineers, developers, and other stakeholders to gather data requirements and achieve comprehensive test coverage.
- Document test cases, results, and identified defects; communicate findings clearly to the team.
- Conduct performance testing to ensure data processing and retrieval meet established benchmarks.
- Provide mentorship and guidance to junior team members, promoting best practices in test automation and data validation.

Required Skills and Qualifications:
- Strong proficiency in Python, Selenium, and SQL for developing test automation solutions.
- Hands-on experience with Databricks, data warehouse, and data lake architectures.
- Proven expertise in automated testing of data pipelines, preferably with tools such as Apache Airflow, dbt test, or similar.
- Proficient in integrating automated tests within CI/CD pipelines on cloud platforms (AWS, Azure preferred).
- Excellent written and verbal communication skills with the ability to translate technical concepts to diverse audiences.
- Bachelor's degree in Computer Science, Information Technology, or a related discipline.
- Demonstrated problem-solving skills and a collaborative approach to teamwork.

Preferred Qualifications:
- Experience with implementing security and data protection measures in data-driven applications.
- Ability to integrate user-facing elements with server-side logic for seamless data experiences.
- Demonstrated passion for continuous improvement in test automation processes, tools, and methodologies.

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Title: Data Engineer
Job Type: Full-Time
Location: On-site, Hyderabad, Telangana, India

Job Summary: We are seeking an accomplished Data Engineer to join the dynamic team of one of our top customers in Hyderabad. You will be instrumental in designing, implementing, and optimizing data pipelines that drive our business insights and analytics. If you are passionate about harnessing the power of big data, possess a strong technical skill set, and thrive in a collaborative environment, we would love to hear from you.

Key Responsibilities:
- Develop and maintain scalable data pipelines using Python, PySpark, and SQL.
- Implement robust data warehousing and data lake architectures.
- Leverage the Databricks platform to enhance data processing and analytics capabilities.
- Model, design, and optimize complex database schemas.
- Collaborate with cross-functional teams to understand data requirements and deliver actionable insights.
- Lead and mentor junior data engineers and establish best practices.
- Troubleshoot and resolve data processing issues promptly.

Required Skills and Qualifications:
- Strong proficiency in Python and PySpark.
- Extensive experience with the Databricks platform.
- Advanced SQL and data modeling skills.
- Demonstrated experience in data warehousing and data lake architectures.
- Exceptional problem-solving and analytical skills.
- Strong written and verbal communication skills.

Preferred Qualifications:
- Experience with graph databases, particularly MarkLogic.
- Proven track record of leading data engineering teams.
- Understanding of data governance and best practices in data management.

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Mumbai

Work from Office

Source: Naukri

About BNP Paribas India Solutions: Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, the European Union's leading bank with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 10,000 employees to provide support and develop best-in-class solutions.

About BNP Paribas Group: BNP Paribas is the European Union's leading bank and a key player in international banking. It operates in 65 countries and has nearly 185,000 employees, including more than 145,000 in Europe. The Group has key positions in its three main fields of activity: Commercial & Personal Banking Services for the Group's commercial and personal banking and several specialised businesses including BNP Paribas Personal Finance and Arval; Investment & Protection Services for savings, investment, and protection solutions; and Corporate & Institutional Banking, focused on corporate and institutional clients. Based on its strong diversified and integrated model, the Group helps all its clients (individuals, community associations, entrepreneurs, SMEs, corporates and institutional clients) to realize their projects through solutions spanning financing, investment, savings and protection insurance. In Europe, BNP Paribas has four domestic markets: Belgium, France, Italy, and Luxembourg. The Group is rolling out its integrated commercial and personal banking model across several Mediterranean countries, Turkey, and Eastern Europe. As a key player in international banking, the Group has leading platforms and business lines in Europe, a strong presence in the Americas, as well as a solid and fast-growing business in Asia-Pacific. BNP Paribas has implemented a Corporate Social Responsibility approach in all its activities, enabling it to contribute to the construction of a sustainable future, while ensuring the Group's performance and stability.

Commitment to Diversity and Inclusion: At BNP Paribas, we passionately embrace diversity and are committed to fostering an inclusive workplace where all employees are valued, respected and can bring their authentic selves to work. We prohibit discrimination and harassment of any kind, and our policies promote equal employment opportunity for all employees and applicants, irrespective of, but not limited to, their gender, gender identity, sex, sexual orientation, ethnicity, race, colour, national origin, age, religion, social status, mental or physical disabilities, veteran status etc. As a global bank, we truly believe that the inclusion and diversity of our teams is key to our success in serving our clients and the communities we operate in.

About the Business Line/Function: CIB Client Engagement and Protection IT focuses on applications servicing client lifecycle management, due diligence/KYC, customer relationship management, service request management, referential data, transaction monitoring (AML), and the data and document platform. The landscape includes projects that are a mix of established systems and systems under transition to new platforms, custom developed as well as commercial products. Teams are in Europe, North America and India, working in a distributed mode following Agile practices and CI/CD DevSecOps practices, with a focus on automation testing coverage, monitoring tools, and observability.

Job Title: Data Technical Architect
Date: 01-May-25
Department: CEP IT
Location: Mumbai
Business Line / Function: Transaction Monitoring - AML
Number of Direct Reports: N/A

Position Purpose: The CEP AML IT team is in charge of AML monitoring tools for CIB and all regions. AML monitoring tools are mainly used by Financial Security Compliance and CIB ITO LoD. The purpose of the role is to hire a highly skilled and experienced AML functional, technical, and data architect with expertise in Actimize models and rules. The candidate will have a strong background in developing and optimizing database models on the AML Data Lake architecture, specifically focusing on Actimize. They will be responsible for designing, implementing, and maintaining data architecture solutions that effectively support our AML and compliance activities, with a specific emphasis on Actimize models and rules. Working closely with stakeholders, including AML analysts, data scientists, and IT teams, the candidate will ensure that the data architecture solutions align with business requirements and adhere to relevant regulations and policies, while also optimizing Actimize models and rules for enhanced detection and prevention of financial crimes. Key responsibilities will include analyzing system requirements, devising data migration strategies, and ensuring the efficient and secure storage of company information.

Direct Responsibilities:
- Collaborate with stakeholders to understand AML functional and data requirements, and translate them into Actimize solution design and data architecture solutions that support AML and compliance activities.
- Design and implement technical and data architecture solutions on the AML products and Data Lake architecture, ensuring scalability, performance, and data integrity.
- Work independently with the Program Manager to understand business requirements and translate them into technical solutions in the application.
- Configure Actimize modules and components to meet specific AML use cases and workflows.
- Integrate Actimize with other systems and data sources to ensure seamless data flow and information exchange.
- Develop and optimize database models to effectively store and retrieve AML-related data, considering data volume, complexity, and reporting needs.
- Establish and enforce data quality and data governance processes to ensure the accuracy, completeness, and consistency of AML data.
- Implement data security and access control processes to protect sensitive AML information and ensure compliance with security standards and privacy regulations.
- Evaluate and propose the integration of new technologies and innovative solutions to enhance AML data management processes, such as advanced analytics, machine learning, or automation.

Technical and Behavioral Competencies:
- Minimum of 10 years of experience as a functional, technical, and data architect, with a strong focus on AML and compliance, demonstrating a deep understanding of industry best practices and regulatory requirements.
- Extensive expertise in Actimize technical and functional architecture, with a proven track record of successfully implementing Actimize models and rules that align with specific line-of-business needs.
- Demonstrated proficiency in developing and optimizing Actimize functional models and rules, as well as designing and optimizing database models on the AML Data Lake architecture.
- Strong experience in data warehouse architectural design, providing efficient and effective solutions in the Compliance AML data domains.
- In-depth knowledge of AML and compliance regulations and policies, ensuring compliance with industry standards and legal requirements.
- Exceptional analytical and problem-solving skills, with the ability to identify and address complex issues related to AML and compliance architecture.
- Excellent communication and interpersonal skills, enabling effective collaboration with stakeholders at all levels of the organization.
- Ability to work both independently and as part of a team, demonstrating strong teamwork and collaboration skills.
- Strong project management skills, with the ability to effectively plan, prioritize, and execute projects within defined timelines and budgets.
- Good experience in technical analysis of n-tier applications with multiple integrations using object-oriented, API and microservices approaches.
- Very good understanding of the principles behind various DevSecOps practices, and working experience with industry-standard tools.
- Experience with Agile methodology is a plus, showcasing adaptability and flexibility in project delivery.
- Good knowledge of front-end technologies, preferably Angular.
- Knowledge of software methodology practices: Agile methodology and Scrum practices.

Business Skills: IT/Business relation (Expert); Compliance and Financial Security (Proficient). IT Skills: databases.
Transversal Skills: Ability to manage a project (Expert); Analytical ability (Expert); Ability to understand, explain and support change (Expert); Ability to set up relevant performance indicators.
Behavioural Skills: Ability to deliver/results driven (Expert); Ability to collaborate (Expert); Adaptability (Expert); Personal impact/ability to influence (Proficient); Resilience (Proficient).

Education Level: Bachelor's degree or equivalent.
Experience Level: At least 10 years.

Posted 1 week ago

Apply

5.0 - 9.0 years

8 - 13 Lacs

Chennai

Work from Office

Source: Naukri

Career Area: Technology, Digital and Data

Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here; we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.

Job Duties: As a Software Engineer you will contribute to the development and deployment of Caterpillar's state-of-the-art digital platform. You will be competent to perform all programming and development assignments without close supervision, and normally assigned the more complex aspects of systems work. You will work directly on complex application/technical problem identification and resolution, including responding to off-shift and weekend support calls, and work independently on complex systems or infrastructure components that may be used by one or more applications or systems. You will drive application development focused on delivering business-valuable features, maintain high standards of software quality within the team by establishing good practices and habits, identify and encourage areas for growth and improvement within the team, and mentor junior developers. You will communicate with end users and internal customers to help direct the development, debugging, and testing of application software for accuracy, integrity, interoperability, and completeness, and perform integrated testing and customer acceptance testing of components, which requires careful planning and execution to ensure timely, quality results.

The position manages the completion of its own work assignments and coordinates work with others. Based on past experience and knowledge, the incumbent normally works independently with minimal management input and review of end results. Typical customers include Caterpillar customers, dealers, other external companies who purchase services offered by Caterpillar, as well as internal business unit and/or service centre groups. The position is challenged to quickly and correctly identify problems that may not be obvious. The incumbent solves problems by determining the best course of action, within departmental guidelines, from many existing solutions, sets priorities, and establishes a work plan in order to complete broadly defined assignments and achieve desired results. The position participates in brainstorming sessions focused on developing new approaches to meeting quality goals in the measures stated.

Basic qualifications:
- A four-year degree from an accredited college or university.
- 7+ years of software development experience, or at least 2 years of experience with a master's degree in computer science or a related field.
- 7+ years of experience in designing and developing data pipelines in Python.

Top candidates will also have proven experience in many of the following:
- Designing, developing, deploying and maintaining software at scale, with a good understanding of concurrency.
- In-depth understanding of processing data pipelines.
- Expertise and experience in building large data lakes and data warehouses such as Snowflake (preferred), Redshift or Synapse (a loading sketch follows this listing).
- Expertise in SQL and NoSQL databases.
- Extensive data warehousing experience.
- At least 2+ years of deploying and maintaining software using public clouds such as AWS or Azure.
- Working within an Agile framework (ideally Scrum).
- Strong analytical skills.
- Solid knowledge of computer science fundamentals such as data structures, algorithms and object-oriented design.
- Ability to work under pressure and within time constraints.
- Passion for technology and an eagerness to contribute to a team-oriented environment.
- Demonstrated leadership on medium to large-scale projects impacting strategic priorities.
- Bachelor's degree in Computer Science, Electrical Engineering, or a related field is required.

Caterpillar is an Equal Opportunity Employer (EEO).

Posting Dates: June 17, 2025 - June 23, 2025
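For a taste of the Python-plus-warehouse work referenced above, a minimal sketch that stages a local file into Snowflake and copies it into a table with the official connector. The account identifier, credentials, warehouse, and object names are hypothetical placeholders.

```python
# Sketch: stage a CSV into a Snowflake table stage, then COPY it in.
# All identifiers below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # placeholder account identifier
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = conn.cursor()
try:
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS")     # table stage
    cur.execute("""
        COPY INTO ORDERS
        FROM @%ORDERS
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
finally:
    cur.close()
    conn.close()
```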

Posted 1 week ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Noida

Hybrid

Source: Naukri

Key Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines using Azure Databricks.
- Optimize and troubleshoot existing data pipelines to enhance performance and reliability.
- Ensure data quality, integrity, and consistency across various data sources.
- Implement ETL processes and manage data flows into data warehouses and data marts.
- Develop and optimize SQL queries on Snowflake for data processing and reporting.
- Utilize Python for data processing, transformation, and automation tasks.
- Monitor pipeline performance, proactively identify issues, and conduct necessary maintenance and updates.
- Maintain comprehensive documentation of data processes, architectures, and technical specifications.

Required Skills: Azure Databricks; Power BI; SSRS and MS SQL; Snowflake; Python; ETL development; GitHub for version control and collaboration; JIRA for work management.

Experience Range: 3 to 5 years.

Interpersonal Skills: Strong problem-solving and analytical abilities. Excellent written and verbal communication skills. Ability to work effectively within a team and collaborate.

Posted 1 week ago

Apply

8.0 - 13.0 years

30 - 45 Lacs

Hyderabad

Work from Office

Source: Naukri

Role: We're looking for a skilled Databricks Solution Architect to lead the design and implementation of data migration strategies and cloud-based data and analytics transformation on the Databricks platform. This role involves collaborating with stakeholders, analyzing data, defining architecture, building data pipelines, ensuring security and performance, and implementing Databricks solutions for machine learning and business intelligence.

Key Responsibilities:
- Define the architecture and roadmap for cloud-based data and analytics transformation on Databricks.
- Design, implement, and optimize scalable, high-performance data architectures using Databricks.
- Build and manage data pipelines and workflows within Databricks.
- Ensure that best practices for security, scalability, and performance are followed.
- Implement Databricks solutions that enable machine learning, business intelligence, and data science workloads.
- Oversee the technical aspects of the migration process, from planning through to execution.
- Create documentation of the architecture, migration processes, and solutions.
- Provide training and support to teams post-migration to ensure they can leverage Databricks.

Preferred candidate profile:

Experience:
- 7+ years of experience in data engineering, cloud architecture, or related fields.
- 3+ years of hands-on experience with Databricks, including the implementation of data engineering solutions, migration projects, and optimizing workloads.
- Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their integration with Databricks.
- Experience in end-to-end data migration projects involving large-scale data infrastructure.
- Familiarity with ETL tools, data lakes, and data warehousing solutions.

Skills:
- Expertise in Databricks architecture and best practices for data processing.
- Strong knowledge of Spark, Delta Lake, DLT, Lakehouse architecture, and other recent Databricks components (a DLT sketch follows this listing).
- Proficiency in Databricks Asset Bundles.
- Expertise in the design and development of migration frameworks using Databricks.
- Proficiency in Python, Scala, SQL, or similar languages for data engineering tasks.
- Familiarity with data governance, security, and compliance in cloud environments.
- Solid understanding of cloud-native data solutions and services.
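To make the DLT reference above concrete, a tiny Delta Live Tables pipeline sketch: a bronze ingest via Auto Loader plus a silver table guarded by an expectation. The source path and column names are hypothetical placeholders; in DLT pipelines a `spark` session is provided by the runtime.

```python
# Sketch: minimal Delta Live Tables pipeline (bronze ingest + validated silver).
# The S3 path and columns are placeholders.
import dlt
from pyspark.sql.functions import col


@dlt.table(comment="Raw orders ingested with Auto Loader")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")           # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("s3://example-bucket/landing/orders/")    # placeholder path
    )


@dlt.table(comment="Validated orders")
@dlt.expect_or_drop("valid_amount", "amount > 0")       # drop bad rows
def orders_silver():
    return dlt.read_stream("orders_bronze").select(
        "order_id", "amount", col("ts").cast("timestamp")
    )
```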

Posted 1 week ago

Apply

6.0 - 10.0 years

7 - 15 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Role: Java + Azure
Experience: 6-10 years
Profile: Hands-on Java plus Azure SaaS/PaaS (Azure Functions, APIs, microservices, API containers).

Responsibilities:
- Lead the design and implementation of scalable and secure solutions using Java and Azure Cloud.
- Oversee the development and deployment of cloud-based applications.
- Hands-on in writing APIs/microservices.
- Hands-on with Azure Functions, Container Apps, Azure Data Factory pipelines and Data Lake.

Posted 1 week ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Source: Naukri

Dear Candidate,

We are hiring a Cloud Data Scientist to build and scale data science solutions in cloud-native environments. This role is ideal for candidates who specialize in analytics and machine learning using cloud ecosystems.

Key Responsibilities:
- Design predictive and prescriptive models using cloud ML tools.
- Use BigQuery, SageMaker, or Azure ML Studio for scalable experimentation.
- Collaborate on data sourcing, transformation, and governance in the cloud.
- Visualize insights and present findings to stakeholders.

Required Skills & Qualifications:
- Strong Python/R skills and experience with cloud ML stacks (AWS, GCP, or Azure).
- Familiarity with cloud-native data warehousing and storage (Redshift, BigQuery, Data Lake).
- Hands-on with model deployment, CI/CD, and A/B testing in the cloud.
- Bonus: background in NLP, time series, or geospatial analysis.

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy
Delivery Manager
Integra Technologies

Posted 1 week ago

Apply

3.0 - 6.0 years

20 - 25 Lacs

Bengaluru

Hybrid


Join us as a Data Engineer II in Bengaluru! Build scalable data pipelines using Python, SQL, AWS, Airflow, and Kafka. Drive real-time and batch data systems across analytics, ML, and product teams. A hybrid work option is available.
Required Candidate Profile:
3+ years in data engineering with strong Python, SQL, AWS, Airflow, Spark, Kafka, Debezium, Redshift, ETL, and CDC experience. Must know data lakes, warehousing, and orchestration tools.
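For orientation, a minimal Airflow DAG of the batch-pipeline shape this role describes might look like the following. The DAG id and task bodies are placeholders, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        """Placeholder: e.g., land CDC events from Kafka/Debezium into S3."""

    def load():
        """Placeholder: e.g., COPY staged files into Redshift."""

    with DAG(
        dag_id="daily_orders_load",      # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task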

Posted 1 week ago

Apply

1.0 - 5.0 years

7 - 8 Lacs

Bengaluru

Work from Office


Diverse Lynx is looking for a Snowflake Developer to join our dynamic team and embark on a rewarding career journey. A Snowflake Developer is responsible for designing and developing data solutions within the Snowflake cloud data platform, playing a critical role in helping organizations store, process, and analyze their data effectively and efficiently.
Responsibilities:
Design and develop data solutions within the Snowflake cloud data platform, including data warehousing, data lake, and data modeling solutions.
Participate in the design and implementation of data migration strategies.
Ensure the quality of custom solutions through appropriate testing and debugging procedures.
Provide technical support and troubleshoot issues as needed.
Stay up to date with the latest developments in the Snowflake platform and data warehousing technologies.
Contribute to the ongoing improvement of development processes and best practices.
Requirements:
Experience in data warehousing and data analytics.
Strong knowledge of SQL and data warehousing concepts.
Experience with Snowflake or other cloud data platforms is preferred.
Ability to analyze and interpret data.
Excellent written and verbal communication skills.
Ability to work independently and as part of a team.
Strong attention to detail and ability to work in a fast-paced environment.
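A minimal sketch of the load-and-verify round trip such a developer handles daily, using the snowflake-connector-python package, is shown below. Account, credentials, file paths, and table names are placeholders.

    import snowflake.connector

    # Connection details are hypothetical placeholders.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    cur = conn.cursor()
    try:
        # Stage a local file into the table stage, load it, and verify the row count.
        cur.execute("PUT file:///tmp/orders.csv @%ORDERS")
        cur.execute("COPY INTO ORDERS FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
        cur.execute("SELECT COUNT(*) FROM ORDERS")
        print("rows loaded:", cur.fetchone()[0])
    finally:
        cur.close()
        conn.close()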

Posted 1 week ago

Apply

4.0 - 8.0 years

30 - 37 Lacs

Bengaluru

Work from Office


ECMS ID/Title: 525632
Number of Openings: 1
Duration of contract: 6
Years of experience (relevant): 4-8 years
Detailed job description - Skill Set: Attached
Mandatory Skills: Azure Data Factory, PySpark notebooks, Spark SQL, and Python.
Good-to-Have Skills: ETL processes, SQL, Azure Data Factory, Data Lake, Azure Synapse, Azure SQL, Databricks, etc.
Vendor billing range: 9,000-10,000/day
Remote option available: Hybrid mode
Work location: Pune and Hyderabad preferred
Start date: Immediate
Client interview / F2F applicable: Yes
Background check process to be followed (before/after onboarding): Post
BGV Agency:
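To illustrate the mandatory skill combination (a PySpark notebook using Spark SQL, typically invoked as a step from an Azure Data Factory pipeline), here is a hedged sketch; the storage paths and view names are hypothetical.

    # Notebook-style Spark SQL transformation; paths and names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("adf-notebook-step").getOrCreate()

    # Expose raw parquet in the data lake as a temporary SQL view.
    spark.read.parquet("abfss://raw@mydatalake.dfs.core.windows.net/sales/") \
         .createOrReplaceTempView("raw_sales")

    # Aggregate with Spark SQL and write the curated output back to the lake.
    daily = spark.sql("""
        SELECT order_date,
               region,
               SUM(amount) AS total_amount
        FROM raw_sales
        WHERE amount IS NOT NULL
        GROUP BY order_date, region
    """)
    daily.write.mode("overwrite").parquet(
        "abfss://curated@mydatalake.dfs.core.windows.net/daily_sales/")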

Posted 1 week ago

Apply

10.0 - 13.0 years

30 - 40 Lacs

Bengaluru

Work from Office


We are seeking a highly skilled and experienced Data Engineering Manager to lead and grow our data platform team. The ideal candidate will have a proven track record of driving data-driven innovation, leading high-performing teams, and delivering impactful solutions. You will be responsible for setting the strategic direction for our data platform initiatives, collaborating with cross-functional teams, and fostering a culture of data-driven decision-making.
What you will do:
Team Leadership: Recruit, hire, and develop a talented team of data scientists and machine learning engineers. Mentor and coach team members to foster professional growth and innovation.
Strategic Leadership: Develop and execute the long-term data science strategy, aligning with the overall business goals. Collaborate with senior leadership to communicate the value of data science initiatives and secure necessary resources.
Team Management: Oversee the end-to-end lifecycle of data engineering projects, from ideation to deployment and ongoing monitoring. Ensure projects are delivered on time, within budget, and to the highest quality standards.
Technical Expertise: Implement best practices in coding, design, and architecture. Provide technical guidance and support to the data engineering team.
What you have:
5+ years of team leadership and 10+ years of total data engineering experience working with large data sets.
Solid experience developing and implementing DW architectures, OLAP and OLTP technologies, and data modeling with star/snowflake schemas to enable self-service reporting and a data lake.
Hands-on, deep experience with the cloud data tech stack, including query engines like Redshift, EMR, AWS RDS, or similar.
Experience building data solutions on cloud platforms using Postgres, SQL Server, Oracle, and similar services and tools.
Advanced SQL and programming experience with Python and/or Spark.
Experience with, or a demonstrated understanding of, real-time data streaming tools like Kafka, Kinesis, or similar.
Familiarity with BI and dashboarding tools and multi-dimensional modeling.
Strong problem-solving capabilities, experience troubleshooting data issues, and experience stabilizing big data systems.
Excellent communication and presentation skills, as you'll be regularly interacting with stakeholders and engineering leadership.
Bachelor's or Master's degree in a quantitative discipline such as Computer Science, Computer Engineering, Analytics, Mathematics, Statistics, Information Systems, or another scientific field.
Nice to have: certification in one of the cloud platforms (AWS/GCP/Azure).
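As an example of the real-time streaming experience the posting asks for, a minimal Spark Structured Streaming read from Kafka might look like this. Broker and topic names are placeholders, and the job assumes the spark-sql-kafka connector package is available to the cluster.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("clickstream-stream").getOrCreate()

    # Subscribe to a (hypothetical) Kafka topic and decode the message payload.
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")
        .option("subscribe", "clickstream")
        .load()
        .select(F.col("value").cast("string").alias("payload"),
                F.col("timestamp"))
    )

    # Count events per minute; a real job would write to a Delta/warehouse sink.
    counts = events.groupBy(F.window("timestamp", "1 minute")).count()
    query = counts.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination()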

Posted 1 week ago

Apply

10.0 - 15.0 years

15 - 19 Lacs

Bengaluru

Work from Office


Job Description
We are seeking a highly skilled and experienced Data Architect to design, implement, and manage the data infrastructure. As a Data Architect, you will play a key role in shaping the data strategy, ensuring data is accessible, reliable, and secure across the organization. You will work closely with business stakeholders, data engineers, and analysts to develop scalable data solutions that support business intelligence, analytics, and operational needs.
Key Responsibilities:
Design and implement effective database solutions (on-prem/cloud) and data models to store and retrieve data for various applications within the FinCrime domain.
Develop and maintain robust data architecture strategies aligned with business objectives.
Define data standards, frameworks, and governance practices to ensure data quality and integrity.
Collaborate with data engineers, software developers, and business stakeholders to integrate data systems and optimize data pipelines.
Evaluate and recommend tools and technologies for data management, warehousing, and processing.
Create and maintain documentation related to data models, architecture diagrams, and processes.
Ensure data security and compliance with relevant regulations (e.g., GDPR, HIPAA, CCPA).
Participate in capacity planning and growth forecasting for the organization's data infrastructure.
Through various POCs, assess and compare multiple tooling options and deliver use cases based on an MVP model, as per expectations.
Requirements
Experience:
10+ years of experience in data architecture, data engineering, or related roles.
Proven experience with relational and NoSQL databases.
Experience with FinCrime domain applications and reporting.
Strong experience with ETL tools, data warehousing, and data lake solutions.
Familiarity with other data technologies such as Spark, Kafka, and Snowflake.
Skills:
Strong analytical and problem-solving skills.
Proficiency in data modelling tools (e.g., ER/Studio, Erwin).
Excellent understanding of database management systems and data security.
Knowledge of data governance, metadata management, and data lineage.
Strong communication and interpersonal skills to collaborate across teams.
Subject matter expertise within FinCrime.
Preferred Qualifications
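To ground the data-modelling requirement, here is an illustrative star-schema sketch for a FinCrime transaction mart, executed against SQLite purely so the example is self-contained; in practice the DDL would target the organization's warehouse, and all table and column names are hypothetical.

    import sqlite3

    ddl = """
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_id  TEXT NOT NULL,
        risk_rating  TEXT
    );
    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,
        calendar_date TEXT NOT NULL
    );
    CREATE TABLE fact_transaction (
        transaction_key INTEGER PRIMARY KEY,
        customer_key    INTEGER REFERENCES dim_customer(customer_key),
        date_key        INTEGER REFERENCES dim_date(date_key),
        amount          REAL NOT NULL,
        is_flagged      INTEGER DEFAULT 0  -- set by downstream screening rules
    );
    """
    with sqlite3.connect(":memory:") as conn:
        conn.executescript(ddl)
        print("tables:", [r[0] for r in conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")])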

Posted 1 week ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Conduct technical analyses of existing data pipelines, ETL processes, and on-premises/cloud systems; identify technical bottlenecks, evaluate migration complexities, and propose optimizations.
Desired skills and experience:
Candidates should have a B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field.
7+ years of experience in data and cloud architecture working with client stakeholders.
Strong experience in Synapse Analytics, Databricks, ADF, Azure SQL (DW/DB), and SSIS.
Strong experience in advanced PowerShell, batch scripting, and C# (.NET 3.0).
Expertise in orchestration systems, including ActiveBatch and Azure orchestration tools.
Strong understanding of data warehousing, data lakes, and Lakehouse concepts.
Excellent communication skills, both written and verbal.
Extremely strong organizational and analytical skills with strong attention to detail.
Strong track record of excellent results delivered to internal and external clients.
Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts.
Experience delivering projects within an agile environment.
Experience in project management and team management.
Key responsibilities include:
Understand and review PowerShell (PS), SSIS, batch script, and C# (.NET 3.0) codebases for data processes.
Assess the complexity of trigger migration across Active Batch (AB), Synapse, ADF, and Azure Databricks (ADB).
Define usage of Azure SQL DW, SQL DB, and Data Lake (DL) for various workloads, proposing transitions where beneficial.
Analyze data patterns for optimization, including direct raw-to-consumption loading and zone elimination (e.g., stg/app zones).
Understand requirements for external tables (Lakehouse).
Lead project deliverables, ensuring actionable and strategic outputs.
Evaluate and ensure the quality of deliverables within project timelines.
Develop a strong understanding of the equity market domain.
Collaborate with domain experts and business stakeholders to understand business rules/logic.
Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders.
Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments.
Take responsibility for end-to-end delivery of projects, coordinate between the client and internal offshore teams, and manage client queries.
Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic.
Take responsibility for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT).
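One concrete shape the Active Batch-to-ADF trigger migration can take is invoking pipeline runs programmatically through the Azure SDK. The sketch below assumes the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, and pipeline names are all placeholders.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # Authenticate with whatever credential the environment provides.
    client = DataFactoryManagementClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",   # placeholder
    )

    # Queue a run of a (hypothetical) ADF pipeline, passing runtime parameters.
    run = client.pipelines.create_run(
        resource_group_name="rg-data",          # hypothetical resource group
        factory_name="adf-core",                # hypothetical factory
        pipeline_name="load_equities_daily",    # hypothetical pipeline
        parameters={"run_date": "2024-01-01"},
    )
    print("queued pipeline run:", run.run_id)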

Posted 1 week ago

Apply