984 Data Bricks Jobs - Page 32

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

4.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

About the Role:

- Minimum 4 years of experience in a relevant field.
- Hands-on experience with Databricks, SQL, Azure Data Factory, and Azure DevOps.
- Strong expertise in Microsoft Azure cloud platform services (Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Synapse Analytics).
- Proficient in CI/CD pipelines in Azure DevOps for automated deployments.
- Good command of performance-optimization techniques such as temp tables, CTEs, indexing, merge statements, and joins.
- Familiarity with advanced SQL and programming skills (e.g., Python, PySpark).
- Familiarity with data warehousing and data modelling concepts.
- Strong data management and deployment skills using Azure Data Factory, Databricks, and Azure DevOps.
- Knowledge of integrating Azure services with DevOps.
- Experience designing and implementing scalable data architectures.
- Proficient in ETL processes and tools.
- Strong communication and collaboration skills.
- Certifications in relevant Azure technologies are a plus.

Location: Bangalore/Hyderabad
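The listing above asks for SQL performance-optimization techniques such as CTEs and merge statements. A minimal sketch of that pattern, using SQLite so it runs anywhere (table names and data are invented; Azure SQL would use a MERGE statement where SQLite uses INSERT ... ON CONFLICT):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE stg_customer (id INTEGER, name TEXT, city TEXT, loaded_at INTEGER);
    INSERT INTO dim_customer VALUES (1, 'Asha', 'Pune');
    INSERT INTO stg_customer VALUES
        (1, 'Asha', 'Bengaluru', 2),   -- newer row for an existing key
        (1, 'Asha', 'Mumbai', 1),      -- older duplicate, should lose
        (2, 'Ravi', 'Chennai', 1);     -- brand-new key
""")
conn.execute("""
    WITH latest AS (                   -- CTE: keep only the newest row per id
        SELECT id, name, city,
               ROW_NUMBER() OVER (PARTITION BY id ORDER BY loaded_at DESC) AS rn
        FROM stg_customer
    )
    INSERT INTO dim_customer (id, name, city)
    SELECT id, name, city FROM latest WHERE rn = 1
    ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city
""")
rows = sorted(conn.execute("SELECT id, city FROM dim_customer"))
print(rows)  # [(1, 'Bengaluru'), (2, 'Chennai')]
```

The CTE deduplicates the staging table in one pass before the upsert touches the target, which is the usual reason interviewers probe this combination.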

Posted 1 month ago

Apply

3.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Source: Naukri

Who We Are

Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of the devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible® a Better Future.

What We Offer

Location: Bangalore, IND

At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive, leading global company. Visit our Careers website to learn more about careers at Applied.

Key Responsibilities

- Provide technical support for applications built using .NET as well as Angular, React and other open-source technologies.
- Troubleshoot and resolve issues related to front ends, APIs and backend services.
- Collaborate with development teams to understand and resolve technical issues.
- Assist in the deployment and maintenance of software applications.
- Ensure the performance, quality, and responsiveness of applications, and apply permanent fixes for critical and recurring issues.
- Help maintain code quality, organization, and automation.
- Perform design reviews with the respective development teams for critical applications and provide inputs.
- Document support processes and solutions for future reference.
- Stay up to date with the latest industry trends and technologies.

Required Skills and Qualifications

- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in software development and support.
- Strong proficiency in .NET, Angular, and React; proficient in Python for backend support.
- Familiarity with the Hadoop ecosystem as well as Databricks.
- Experience with RESTful APIs and web services.
- Solid understanding of front-end technologies, including HTML5, CSS3, and JavaScript, as well as Azure and AWS.
- Strong background in SQL Server and other relational databases.
- Familiarity with version control systems (e.g., Git) and Atlassian products for software development and code deployment mechanisms/DevOps.
- Best practices in hosting applications on containerized platforms like OCP (on-prem and cloud).
- Experience with open-source projects and contributions.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.
- Certifications in relevant areas, especially Microsoft, are a plus.

Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline; knowledge of the semiconductor industry is nice to have.

Interpersonal Skills: Explains difficult or sensitive information; works to build consensus.

Additional Information

- Time Type: Full time
- Employee Type: Assignee / Regular
- Travel: Yes, 10% of the Time
- Relocation Eligible: Yes

Applied Materials is an Equal Opportunity Employer.
Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
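The support role above centers on troubleshooting REST APIs and backend services. A generic retry-with-backoff helper of the kind such teams keep on hand, as a hedged sketch (the function names and the simulated flaky call are invented for illustration, not Applied Materials code):

```python
import time

def retry(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying transient failures with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise                           # out of attempts: surface the error
            time.sleep(base_delay * (2 ** i))   # back off: 0.01s, 0.02s, ...

# Simulate a service that fails twice, then recovers.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = retry(flaky)
print(result, calls["n"])  # ok 3
```

The same shape wraps an HTTP client call in real support tooling; the point is that transient faults are absorbed while persistent ones still raise for a human to investigate.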

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

India

On-site

Source: Foundit

About the Role:

Job Description: We are seeking a skilled and motivated Application Operations Engineer for an SRE role with Java, React JS and Spring Boot skills, along with expertise in Databricks, particularly with Oracle integration, to join our dynamic SRE team. The ideal candidate should have 3 to 6 years of experience supporting robust web applications using Java, React JS and Spring Boot, with a strong background in managing and optimizing data workflows leveraging Oracle databases. The incumbent will be responsible for supporting applications, troubleshooting issues, and providing RCAs and suggested fixes by managing continuous integration and deployment pipelines, automating processes, and ensuring system reliability, maintainability and stability.

Responsibilities:

- Work in CI/CD, handle infrastructure issues, support operations, and maintain user-facing features using React JS, Spring Boot and Java.
- Support reusable components and front-end libraries for future use.
- Partner with development teams to improve services through rigorous testing and release procedures.
- Show willingness to learn new tools and technologies as the project demands.
- Ensure the technical feasibility of UI/UX designs.
- Optimize applications for maximum speed and scalability.
- Collaborate with other team members and stakeholders.
- Work closely with data engineers to ensure smooth data flow and integration.
- Create and maintain documentation for data processes and workflows.
- Troubleshoot and resolve issues related to data integrity and performance.
- Good to have: working knowledge of the Tomcat application server and Apache web server, Oracle, Postgres; command of Linux and Unix.
- Self-driven individual.

Requirements:

- Bachelor's degree in computer science, engineering, or a related field.
- 3-6 years of professional experience.
- Proficiency in advanced Java and JavaScript, including DOM manipulation and the JavaScript object model.
- Experience with popular React JS workflows (such as Redux, MobX, Flux).
- Familiarity with RESTful APIs.
- Experience with cloud platforms such as AWS and Azure.
- Knowledge of CI/CD pipelines and DevOps practices.
- Experience with data engineering tools and technologies, particularly Databricks.
- Proficiency in Oracle database technologies and SQL queries.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
- Good verbal and written communication skills.
- Familiarity with ITSM processes like Incident, Problem and Change Management using ServiceNow (preferable).
- Ability to work in shifts.

Grade: 09
Location: Hyderabad
Hybrid Mode: twice a week work from office
Shift Time: 6:30 am to 1 pm OR 2 pm to 10 pm IST

S&P Global Ratings is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all, from finding new ways to measure sustainability, to analyzing the energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:

- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you.
S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit:

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

S&P Global has a Securities Disclosure and Trading Policy (the Policy) that seeks to mitigate conflicts of interest by monitoring and placing restrictions on personal securities holdings and trading. The Policy is designed to promote compliance with global regulations. In some divisions, pursuant to the Policy's requirements, candidates at S&P Global may be asked to disclose securities holdings. Some roles may include a trading prohibition and remediation of positions when there is an effective or potential conflict of interest. Employment at S&P Global is contingent upon compliance with the Policy.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision.

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
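The SRE role above involves troubleshooting issues and writing RCAs. A tiny, hedged sketch of the first step of that triage — counting errors per service from plain-text logs to find the noisiest component (the log format and service names are invented, not S&P Global's):

```python
from collections import Counter

logs = [
    "2024-05-01T10:00:01 orders ERROR timeout calling oracle",
    "2024-05-01T10:00:02 orders INFO retry scheduled",
    "2024-05-01T10:00:03 billing ERROR connection reset",
    "2024-05-01T10:00:04 orders ERROR timeout calling oracle",
]

# Second whitespace-separated field is the service name in this format.
errors = Counter(
    line.split()[1]
    for line in logs
    if " ERROR " in line
)
worst, count = errors.most_common(1)[0]
print(worst, count)  # orders 2
```

In practice the same aggregation runs over ServiceNow incident exports or application log files; the output points the RCA at the right service before any deeper digging starts.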

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Gurugram

Work from Office

Source: Naukri

Description: Data Analyst II

Syneos Health is a leading fully integrated biopharmaceutical solutions organization built to accelerate customer success. We translate unique clinical, medical affairs and commercial insights into outcomes to address modern market realities. Every day we perform better because of how we work together, as one team, each the best at what we do. We bring a wide range of talented experts together across a wide range of business-critical services that support our business. Every role within Corporate is vital to furthering our vision of Shortening the Distance from Lab to Life.

JOB SUMMARY

The Data Analyst II supports our business goals by analyzing datasets and providing recommendations to maximize efficiency and effectiveness for our project teams and customers. This role also plays a key part in tracking product approvals, measuring customer satisfaction, supporting business development efforts, and maintaining key performance data to drive strategic decision-making.

Job Responsibilities

- Work independently to solve open-ended questions.
- Design and analyze tests and experiments.
- Maintain documentation of analytical processes and projects.
- Build, maintain, and improve performance dashboards, leveraging customer feedback for usability and accessibility.
- Advise clients on relevant best practices and ensure the data is easily retrievable for their review.
- Support data quality and understand customer needs as they evolve.
- Mentor and coach junior team members.
- Support site advocacy group meetings by inviting PIs, discussing blinded protocols, collecting feedback, and managing scheduling, hosting, and meeting minutes.
- Develop and manage capabilities decks twice annually, along with bespoke slides and marketing information sheets using Power BI data.
- Track and analyze business development outcomes through opportunity trackers, monitoring RFP success rates, regulatory approvals, and win rates.
- Monitor customer satisfaction by reviewing feedback from the EM
team and facilitating monthly cross-time-zone communications.
- Oversee product approval tracking, ensuring visibility into product lifecycle status and final approval outcomes.

QUALIFICATION REQUIREMENTS

- Bachelor's degree in a related field such as Computer Science or Statistics.
- Strong data manipulation skills: querying and manipulating data with SQL; advanced MS Excel skills (VLOOKUP, functions, dashboards, Power Pivot) and knowledge of Python or R.
- Experience with A/B conversion testing.
- Cross-functional collaboration experience with IT and data engineering teams to ensure the infrastructure supports scalable, efficient data analysis.
- Concise and clear written and oral communication.
- Proven experience delivering insights and reports with PowerPoint slides to customers.
- Strong attention to detail.
- Experience building dashboards in Power BI or Tableau.

Preferred

- Knowledge of clinical decision support systems and healthcare operational workflows.
- Familiarity with cloud platforms (Azure, AWS) for data storage and analysis.
- Experience working with Databricks, Apache Spark, and ETL pipelines for large-scale data processing and analytics.

Get to know Syneos Health: Over the past 5 years, we have worked with 94% of all Novel FDA Approved Drugs, 95% of EMA Authorized Products and over 200 Studies across 73,000 Sites and 675,000+ Trial patients. No matter what your role is, you'll take the initiative and challenge the status quo with us in a highly competitive and ever-changing environment. Learn more about Syneos Health.
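The qualifications above call out A/B conversion testing. A hedged sketch of the core calculation, a two-proportion z-test in pure Python (the visitor and conversion counts are made up; real work would reach for statsmodels or scipy rather than hand-rolling the formula):

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic and two-sided p-value for conversion rates of A vs B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))   # pooled standard error
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts 150/2400 vs A's 120/2400 — is the lift significant?
z, p_value = two_proportion_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(round(z, 2), round(p_value, 3))
```

With these invented numbers the lift sits just above the conventional 0.05 threshold, which is exactly the kind of borderline result an analyst has to communicate carefully.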

Posted 1 month ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together, when we combine your strengths with ours, is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You.

Job Description

Job Title: Senior ML Engineer
Location: Bangalore
Reporting to: Director, Data Analytics

Purpose of the role

Anheuser-Busch InBev (AB InBev)'s Supply Analytics is responsible for building competitive, differentiated solutions that enhance brewery efficiency through data-driven insights. We optimize processes, reduce waste, and improve productivity by leveraging advanced analytics and AI-driven solutions. The Senior MLE will be responsible for the end-to-end deployment of machine learning models on edge devices. You will take ownership of all aspects of edge deployment, including model optimization, scaling complexities, containerization, and infrastructure management, ensuring high availability and performance.

Key tasks & accountabilities

- Lead the entire edge deployment lifecycle, from model training to deployment and monitoring on edge devices.
- Develop and maintain a scalable edge ML pipeline that enables real-time analytics at brewery sites.
- Optimize and containerize models using Portainer, Docker, and Azure Container Registry (ACR) to ensure efficient execution in constrained edge
environments.
- Own and manage the GitHub repository, ensuring structured, well-documented, and modularized code for seamless deployments.
- Establish robust CI/CD pipelines for continuous integration and deployment of models and services.
- Implement logging, monitoring, and alerting for deployed models to ensure reliability and quick failure recovery.
- Ensure compliance with security and governance best practices for data and model deployment in edge environments.
- Document the thought process and create artifacts on the team repo/wiki that can be shared with business and engineering for sign-off.
- Review code quality and designs developed by peers.
- Significantly improve the performance and reliability of our code to produce high-quality, reproducible results.
- Develop internal tools/utilities that improve the productivity of the entire team.
- Collaborate with other team members to advance the team's ability to ship high-quality code fast!
- Mentor/coach junior team members to continuously upskill them.
- Maintain basic developer hygiene, including but not limited to writing tests, using loggers, and keeping READMEs, to name a few.

Qualifications, Experience, Skills

Education: Bachelor's or Master's in Computer Applications, Computer Science, or any engineering discipline.

Previous Work Experience

- 5+ years of real-world experience developing scalable, high-quality ML models.
- Strong problem-solving skills with an owner's mindset, proactively identifying and resolving bottlenecks.

Technical Skills Required

- Proficiency with pandas, NumPy, SciPy, scikit-learn, statsmodels, TensorFlow.
- Good understanding of statistical computing and parallel processing.
- Experience with advanced distributed TensorFlow, NumPy, joblib.
- Good understanding of memory management and parallel processing in Python.
- Profiling and optimization of production code.
- Strong Python coding skills; exposure to working in IDEs such as VS Code or PyCharm.
- Experience in code versioning using Git, maintaining a modularized code base for multiple deployments.
- Experience working in an Agile environment.
- In-depth understanding of Databricks (workflows, cluster creation, repo management).
- In-depth understanding of machine learning solutions in the Azure cloud.
- Best practices in coding standards, unit testing, and automation.
- Proficiency in Docker, Kubernetes, Portainer, and container orchestration for edge computing.

Other Skills Required

- Experience in real-time analytics and edge AI deployments.
- Exposure to DevOps practices, including infrastructure automation and monitoring tools.
- Contributions to OSS or Stack Overflow.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
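The skills list above asks for parallel processing in Python. A minimal sketch of the fan-out pattern using the standard library's executor pool (the "sensor batches" and summary function are invented; on real brewery workloads this would be joblib or Spark, and CPU-bound work would use processes rather than threads):

```python
from concurrent.futures import ThreadPoolExecutor

def summarize(batch):
    """Per-batch summary statistics: (min, max, mean)."""
    return min(batch), max(batch), sum(batch) / len(batch)

batches = [[1, 2, 3], [10, 20], [5, 5, 5, 5]]

with ThreadPoolExecutor(max_workers=3) as pool:
    stats = list(pool.map(summarize, batches))  # map preserves input order

print(stats)  # [(1, 3, 2.0), (10, 20, 15.0), (5, 5, 5.0)]
```

`pool.map` keeps results in input order even when workers finish out of order, which is why it is usually preferred over collecting futures by completion time for this kind of batch summarization.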

Posted 1 month ago

Apply

10.0 - 15.0 years

30 - 45 Lacs

Pune

Work from Office

Source: Naukri

Azure Cloud Data Solutions Architect

Job Title: Azure Cloud Data Solutions Architect
Location: Pune, India
Experience: 10 - 15 Years
Work Mode: Full-time, Office-based
Company: Smartavya Analytica Private Limited

Company Overview: Smartavya Analytica is a niche Data and AI company based in Mumbai, established in 2017. We specialize in data-driven innovation, transforming enterprise data into strategic insights. With expertise spanning 25+ Data Modernization projects and handling large datasets up to 24 PB in a single implementation, we have successfully delivered data and AI projects across multiple industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are specialists in Cloud, Hadoop, Big Data, AI, and Analytics, with a strong focus on Data Modernization for on-premises, private, and public cloud platforms. Visit us at: https://smart-analytica.com

Job Summary: We are seeking an experienced Azure Cloud Data Solutions Architect to lead end-to-end architecture and delivery of enterprise-scale cloud data platforms. The ideal candidate will have deep expertise in Azure Data Services, Data Engineering, and Data Governance, with the ability to architect and guide cloud modernization initiatives.

Key Responsibilities:

- Architect and design data lakehouses, data warehouses, and analytics platforms using Azure Data Services.
- Lead implementations using Azure Data Factory (ADF), Azure Synapse Analytics, and Microsoft Fabric (OneLake ecosystem).
- Define and implement data governance frameworks, including cataloguing, lineage, security, and quality controls.
- Collaborate with business stakeholders, data engineers, and developers to translate business requirements into scalable Azure architectures.
- Ensure platform design meets performance, scalability, security, and regulatory compliance needs.
- Guide migration of on-premises data platforms to Azure Cloud environments.
- Create architectural artifacts: solution blueprints, reference architectures, governance models, and best-practice guidelines.
- Collaborate with sales/presales in customer meetings to understand the business requirements and scope of work, and propose relevant solutions.
- Drive MVPs/PoCs and capability demos for prospective customers and opportunities.

Must-Have Skills:

- 10-15 years of experience in data architecture, data engineering, or analytics solutions.
- Hands-on expertise in Azure Cloud services: ADF, Synapse, Microsoft Fabric (OneLake), and Databricks (good to have).
- Strong understanding of data governance, metadata management, and compliance frameworks (e.g., GDPR, HIPAA).
- Deep knowledge of relational and non-relational databases (SQL, NoSQL) on Azure.
- Experience with security practices (IAM, RBAC, encryption, data masking) in cloud environments.
- Strong client-facing skills with the ability to present complex solutions clearly.

Preferred Certifications:

- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Azure Data Engineer Associate
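The must-have skills above include data masking. One common flavor is deterministic pseudonymisation, where the same raw value always maps to the same token so downstream joins and aggregations still work. A hedged sketch (column names, rows, and the salt are invented; in production the salt would come from a managed secret store such as Key Vault):

```python
import hashlib

SALT = b"demo-salt"  # illustration only — never hard-code a real salt

def mask(value: str) -> str:
    """Replace a PII value with a short, stable, salted hash token."""
    digest = hashlib.sha256(SALT + value.encode()).hexdigest()
    return digest[:12]

rows = [{"email": "a@x.com", "spend": 120}, {"email": "a@x.com", "spend": 80}]
masked = [{"email": mask(r["email"]), "spend": r["spend"]} for r in rows]

# The same input maps to the same token, so per-customer aggregation
# still works without exposing the raw address.
print(masked[0]["email"] == masked[1]["email"])   # True
print(masked[0]["email"] != rows[0]["email"])     # True
```

Deterministic masking trades some privacy strength (equal values remain linkable) for analytical utility; fully random tokenization is safer but breaks joins.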

Posted 1 month ago

Apply

1.0 - 3.0 years

11 - 15 Lacs

Mumbai

Work from Office

Source: Naukri

Overview

The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.

Responsibilities

As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.

Qualifications

- Core Java, Spring Boot, Apache Spark, Spring Batch, Python.
- Exposure to SQL databases such as Oracle, MySQL, or Microsoft SQL Server is a must.
- Experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have.
- Exposure to NoSQL databases such as Neo4j or document databases is also good to have.

What we offer you

- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law.
MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
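The listing above describes a normalize → quality-check → assign-identifier flow for vendor data. A minimal sketch of that pipeline shape (field names, rules, and the starting identifier are our invention, not MSCI's schema):

```python
def normalize(record):
    """Canonicalize raw vendor fields: trim/uppercase names, coerce price."""
    return {"name": record["name"].strip().upper(),
            "price": float(record["price"])}

def quality_ok(record):
    """Quality gate: non-empty name and a positive price."""
    return bool(record["name"]) and record["price"] > 0

raw = [{"name": " acme corp ", "price": "10.5"},
       {"name": "", "price": "3"},          # fails the name check
       {"name": "Globex", "price": "7"}]

released, next_id = [], 1000
for rec in map(normalize, raw):
    if quality_ok(rec):
        rec["internal_id"] = next_id        # assign internal identifier
        next_id += 1
        released.append(rec)

print([(r["internal_id"], r["name"]) for r in released])
# [(1000, 'ACME CORP'), (1001, 'GLOBEX')]
```

At terabyte scale the same three stages become Spark or Databricks jobs, but the contract is identical: only records that pass the quality gate receive an identifier and get released downstream.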

Posted 1 month ago

Apply

6.0 - 11.0 years

19 - 27 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Source: Naukri

We are looking for an "Azure Databricks Engineer" with a minimum of 6 years of experience. Contact: Atchaya (95001 64554)

Required Candidate Profile

- Experience in Azure Databricks and Python.
- Must have: Databricks, Python, Azure.
- The candidate must have 7-10 years of experience in Databricks and Delta Lake.
- Hands-on experience with Azure.
- Experience with Python scripting.

Posted 1 month ago

Apply

7.0 - 12.0 years

27 - 42 Lacs

Chennai

Work from Office

Source: Naukri

Azure Databricks/Data Factory

- Working with event-based/streaming technologies to ingest and process data.
- Working with other members of the project team to support delivery of additional project components (API interfaces, search).
- Evaluating the performance and applicability of multiple tools against customer requirements.
- Working within an Agile delivery/DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
- Strong knowledge of data management principles.
- Experience in building ETL/data warehouse transformation processes.
- Direct experience of building data pipelines using Databricks.
- Experience using geospatial frameworks on Apache Spark and associated design and development patterns.
- Experience working in a DevOps environment with tools such as Terraform.
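The first bullet above concerns event-based/streaming ingestion. A loose, dependency-free sketch of the tumbling-window aggregation such a job typically performs (in Databricks this would be Structured Streaming with a `window()` on event time; the events and the 10-second window here are invented):

```python
from collections import defaultdict

WINDOW = 10  # window length in seconds

# (event_time_seconds, event_kind) pairs, as a stream might deliver them
events = [(3, "click"), (7, "click"), (12, "view"), (14, "click"), (21, "view")]

counts = defaultdict(int)
for ts, kind in events:
    window_start = (ts // WINDOW) * WINDOW   # bucket timestamp into its window
    counts[(window_start, kind)] += 1

print(sorted(counts.items()))
# [((0, 'click'), 2), ((10, 'click'), 1), ((10, 'view'), 1), ((20, 'view'), 1)]
```

The integer-division bucketing is the whole trick: every event lands in exactly one non-overlapping window, so counts can be emitted incrementally as windows close.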

Posted 1 month ago

Apply

5.0 - 8.0 years

20 - 35 Lacs

Mumbai, Pune

Hybrid

Source: Naukri

About the Company: Freestone Infotech is a global IT solutions company providing innovative, best-in-class turnkey solutions to enterprises worldwide. Freestone Infotech addresses the enterprise-wide, end-to-end needs of organizations with its expertise in Big Data Solutions, Data Analysis, Machine Learning, Business Intelligence, R&D, product development, and mobile application development.

Job Overview: As a Senior Software Engineer at Freestone Infotech Pvt. Ltd., you should be able to work independently with little supervision. You should have excellent organization and problem-solving skills. If you also have hands-on experience in software development and agile methodologies, we'd like to meet you.

Job Title: Senior Software Engineer
Experience: 5 to 8 years

Your experience should include:

- 5+ years of experience in Java and related technologies.
- 5+ years of experience in software development.
- Experience with Kubernetes/Docker deployment.
- Experience with Maven.
- Experience with SQL queries.
- Experience with Linux and bash scripting.
- Knowledge of version control (Git, etc.).
- Experience with Jenkins and CI/CD pipelines.
- Experience with JUnit/Mockito for testing.
- Familiarity with RESTful API development.
- Experience in Java multi-threading development.

Nice to have:

- Experience with Apache Ranger and the data access/governance domain.
- Experience with microservices, Python, Scala.
- Experience with OpenTelemetry for monitoring and metrics.
- Experience with Grafana for visualization and monitoring.
- Experience with a Python testing framework, i.e. pytest.
- Experience with a performance-testing tool such as Locust.
- Experience with cloud services: ADLS, S3, GCP.
- Experience with big data technologies such as Apache Spark, Apache Hive, EMR.
- Experience with Snowflake/Databricks/Lake Formation.

Education: Bachelor's/Master's degree in computer science, information technology, or a related field.

Posted 1 month ago

Apply

12.0 - 22.0 years

25 - 40 Lacs

Bangalore Rural, Bengaluru

Work from Office

Naukri logo

Role & responsibilities Requirements: Data Modeling (Conceptual, Logical, Physical)- Minimum 5 years Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL)- Minimum 5 years Cloud Platforms (AWS, Azure, GCP) - Minimum 3 Years ETL Tools (Informatica, Talend, Apache Nifi) - Minimum 3 Years Big Data Technologies (Hadoop, Spark, Kafka) - Minimum 5 Years Data Governance & Compliance (GDPR, HIPAA) - Minimum 3 years Master Data Management (MDM) - Minimum 3 years Data Warehousing (Snowflake, Redshift, BigQuery)- Minimum 3 years API Integration & Data Pipelines - Good to have. Performance Tuning & Optimization - Minimum 3 years Business Intelligence (Power BI, Tableau)- Minimum 3 years Job Description: We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/ . Experience and deep knowledge about at least one of these 3 platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights. Key Responsibilities: 1. Data Governance & Management Establish and maintain a Data Usage Hierarchy to ensure structured data access. Define data policies, standards, and governance frameworks to ensure consistency and compliance. Implement Data Quality Management practices to improve accuracy, completeness, and reliability. Oversee Metadata and Master Data Management (MDM) to enable seamless data integration across platforms. 2. Data Architecture & Migration Lead the migration of data systems from legacy infrastructure to Microsoft Fabric. Design scalable, high-performance data architectures that support business intelligence and analytics. 
Collaborate with IT and engineering teams to ensure efficient data pipeline development. 3. Advanced Analytics & Machine Learning Identify and define use cases for advanced analytics that align with business objectives. Design and develop machine learning models to drive data-driven decision-making. Work with data scientists to operationalize ML models and ensure real-world applicability. Required Qualifications: Proven experience as a Data Architect or similar role in data management and analytics. Strong knowledge of data governance frameworks, data quality management, and metadata management. Hands-on experience with Microsoft Fabric and data migration from legacy systems. Expertise in advanced analytics, machine learning models, and AI-driven insights. Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP). Strong communication skills with the ability to translate complex data concepts into business insights. Preferred candidate profile Immediate Joiner

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Naukri logo

Job Title: ======= Microsoft ETL Developer - Microsoft SSIS / Informatica x4 positions Onsite Location: ============= Dubai, UAE Doha, Qatar Riyadh, Saudi Onsite Monthly Salary: ============== 10k AED - 15k AED Offshore Location: =============== Pune / Hyderabad / Chennai / Bangalore / Mumbai Offshore Annual Salary: ============== 12 LPA - 20 LPA Note: ===== You need to travel onsite (UAE) on an as-needed basis Project duration: ============= 2 Years Initially Desired Experience Level Needed: =========================== 5 - 10 Years Qualification: ========== B.Tech / M.Tech / MCA / M.Sc or equivalent Experience Needed: =============== Overall: 5 or more years of total IT experience Solid 3+ years of experience as an ETL Developer with Microsoft SSIS / Informatica Job Responsibilities: ================ - Design and develop ETL data flows - Design Microsoft ETL packages - Able to code T-SQL - Able to create orchestrations - Able to design batch jobs / orchestration runs - Familiarity with data models - Able to develop MDM (Master Data Management) and design SCD-1/2/3 as per client requirements Experience: ================ - Experience as ETL Developer with Microsoft SSIS - Exposure and experience with Azure services including Azure Data Factory - Sound knowledge of BI practices and visualization tools such as Power BI / SSRS / QlikView - Collecting / gathering data from various multiple source systems - Loading the data using ETL - Creating automated data pipelines - Configuring Azure resources and services Skills: ================ - Microsoft SSIS - Informatica - Azure Data Factory - Spark - SQL Nice to have: ========== - Any onsite experience is an added advantage, but not mandatory - Microsoft certifications are an added advantage Business Vertical: ============== - Banking / Investment Banking - Capital Markets - Securities / Stock Market Trading - Bonds / Forex Trading - Credit Risk - Payments Cards Industry (VISA / MasterCard / Amex) Job 
Code: ====== ETL_DEVP_0525 No. of positions: ============ 04 Email: ===== spectrumconsulting1977@gmail.com If you are interested, please email your CV as an ATTACHMENT with job ref. code [ ETL_DEVP_0525 ] as the subject
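The SCD-1/2/3 requirement above centres on Type 2: when a tracked attribute changes, expire the current dimension row and insert a new version. A minimal sketch of that logic using Python's sqlite3 (table and column names are illustrative; in SSIS this would typically be a Slowly Changing Dimension transform or a T-SQL MERGE):

```python
import sqlite3

# Illustrative SCD Type-2 upsert: expire the current row, insert the new version.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Pune', '2024-01-01', NULL, 1)")

def scd2_update(conn, customer_id, new_city, change_date):
    """If the tracked attribute changed, close the current row and add a new one."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,))
    row = cur.fetchone()
    if row and row[0] != new_city:          # attribute changed: version the row
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (change_date, customer_id))
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date))

scd2_update(conn, 1, "Mumbai", "2024-06-01")
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
print(rows)  # -> [('Pune', 0), ('Mumbai', 1)]
```

SCD-1 would simply overwrite the city in place; SCD-3 would keep the previous value in an extra column.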

Posted 1 month ago

Apply

9.0 - 14.0 years

50 - 85 Lacs

Noida

Work from Office

Naukri logo

About the Role We are looking for a Staff Engineer specialized in Master Data Management to design and develop our next-generation MDM platform. This role is ideal for engineers who have created or contributed significantly to MDM solutions. You'll lead the architecture and development of our core MDM engine, focusing on data modeling, matching algorithms, and governance workflows that enable our customers to achieve a trusted, 360-degree view of their critical business data. A Day in the Life Collaborate with data scientists, product managers, and engineering teams to define system architecture and design. Architect and develop scalable, fault-tolerant MDM platform components that handle various data domains. Design and implement sophisticated entity matching and merging algorithms to create golden records across disparate data sources. Develop or integrate flexible data modeling frameworks that can adapt to different industries and use cases. Create robust data governance workflows, including approval processes, audit trails, and role-based access controls. Build data quality monitoring and remediation capabilities into the MDM platform. Collaborate with product managers, solution architects, and customers to understand industry-specific MDM requirements. Develop REST APIs and integration patterns for connecting the MDM platform with various enterprise systems. Mentor junior engineers and promote best practices in MDM solution development. Lead technical design reviews and contribute to the product roadmap. What You Need 8+ years of software engineering experience, with at least 5 years focused on developing master data management solutions or components. Proven experience creating or significantly contributing to commercial MDM platforms, data integration tools, or similar enterprise data management solutions. Deep understanding of MDM concepts including data modeling, matching/merging algorithms, data governance, and data quality management. 
Strong expertise in at least one major programming language such as Java, Scala, Python, or Go. Experience with database technologies including relational (Snowflake, Databricks, PostgreSQL) and NoSQL systems (MongoDB, Elasticsearch). Knowledge of data integration patterns and ETL/ELT processes. Experience designing and implementing RESTful APIs and service-oriented architectures. Understanding of cloud-native development and deployment on AWS or Azure. Familiarity with containerization (Docker) and orchestration tools (Kubernetes). Experience with event-driven architectures and messaging systems (Kafka, RabbitMQ). Strong understanding of data security and privacy considerations, especially for sensitive master data.
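The entity matching and merging described above (scoring candidate pairs, then merging survivors into a golden record) can be sketched at toy scale with stdlib fuzzy matching. The threshold, field names, and survivorship rule below are illustrative assumptions; production MDM matching is far more sophisticated:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def merge_golden(records, threshold=0.7):
    """Greedy entity resolution: attach each record to the first golden record
    whose name matches above the threshold, filling empty fields (a simple
    'first non-empty value wins' survivorship rule)."""
    golden = []
    for rec in records:
        for g in golden:
            if similarity(rec["name"], g["name"]) >= threshold:
                for k, v in rec.items():      # survivorship: fill gaps only
                    if not g.get(k):
                        g[k] = v
                break
        else:
            golden.append(dict(rec))
    return golden

sources = [
    {"name": "Acme Corp", "phone": "", "city": "Pune"},
    {"name": "ACME Corporation", "phone": "555-0101", "city": ""},
    {"name": "Globex Ltd", "phone": "555-0202", "city": "Noida"},
]
result = merge_golden(sources)
print(len(result))  # -> 2: the two Acme variants collapse into one golden record
```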

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Naukri logo

Dear Candidate, Looking for a Cloud Data Engineer to build cloud-based data pipelines and analytics platforms. Key Responsibilities: Develop ETL workflows using cloud data services. Manage data storage, lakes, and warehouses. Ensure data quality and pipeline reliability. Required Skills & Qualifications: Experience with BigQuery, Redshift, or Azure Synapse. Proficiency in SQL, Python, or Spark. Familiarity with data lake architecture and batch/streaming. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies

Posted 1 month ago

Apply

6.0 - 11.0 years

19 - 27 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Naukri logo

We are looking for an "AWS Databricks Data Engineer" with a minimum of 6 years of experience. Contact - Atchaya (95001 64554) Required Candidate profile: Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS. Must have: AWS Databricks, PySpark, Snowflake, Talend

Posted 1 month ago

Apply

6.0 - 9.0 years

10 - 16 Lacs

Pune

Work from Office

Naukri logo

Lead end-to-end delivery of Azure-based data projects, ensuring timely execution and quality outcomes using Azure Data Factory, Databricks, and PySpark. Manage and mentor a team, providing technical guidance and ensuring smooth production support. Required Candidate profile Skills:Azure Data Factory, Databricks, Data warehouse, Production support, and PySpark Strong Delivery exp + Team leading exp(minimum 2 years)+Someone who has led an Azure project for at least 2 years

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 13 Lacs

Kolkata

Work from Office

Naukri logo

Job Summary: We are seeking a skilled and motivated Data Engineer with 5-8 years of experience to join our growing data team. The ideal candidate will be responsible for designing, developing, testing, deploying, and maintaining robust, scalable, and efficient data pipelines and infrastructure. You will work closely with data scientists, analysts, software engineers, and business stakeholders to understand data requirements and deliver high-quality data solutions that drive business insights and decisions. Key Responsibilities: Design, build, and maintain scalable and reliable ETL/ELT data pipelines to ingest, transform, and load data from diverse sources (e.g., relational databases, APIs, streaming platforms, flat files). Develop and manage data warehousing solutions, ensuring data integrity, optimal performance, and cost-effectiveness. Implement data models, data schemas, and data dictionaries to support business and analytical requirements. Ensure data quality, consistency, and accuracy across all data systems by implementing data validation, cleansing, and monitoring processes. Optimize data pipeline performance and troubleshoot data-related issues. Collaborate with data scientists and analysts to provide them with clean, well-structured, and readily accessible data for their analysis and modelling needs. Implement and maintain data security and governance best practices. Automate data processes and workflows using scripting and orchestration tools. Document data pipelines, architectures, and processes. Stay up to date with emerging data technologies and best practices, and recommend improvements to our existing data stack. Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field. 5-8 years of hands-on experience in a Data Engineering role. Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server). Proficiency in Python. 
Experience with building and optimizing data pipelines using ETL/ELT tools and frameworks (e.g., Apache Airflow, dbt, Informatica, Talend, custom scripts). Hands-on experience with big data technologies (e.g., Apache Spark, Hadoop ecosystem - HDFS, MapReduce, Hive). Experience with cloud platforms (e.g., Azure - ADLS, Databricks, Synapse; GCP - GCS, BigQuery, Dataflow). Understanding of data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, Redshift, BigQuery, Synapse Analytics). Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) is a plus. Experience with version control systems (e.g., Git). Strong analytical and problem-solving skills. Excellent communication and collaboration skills, with the ability to work effectively in a team environment. Ability to manage multiple tasks and projects simultaneously. Preferred/Bonus Skills: Experience with real-time data streaming technologies (e.g., Apache Kafka, Kinesis, Flink, Spark Streaming). Knowledge of containerization and orchestration (e.g., Docker, Kubernetes). Familiarity with CI/CD pipelines for data engineering
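The data validation and cleansing responsibilities above typically begin as simple row-level checks applied before load. A minimal sketch of such a quality gate (the rules and field names are illustrative assumptions, not from the posting):

```python
# Row-level data-quality gate: route rows to "clean" or "rejected" with the
# list of failed checks, the pattern a pipeline applies before loading.

RULES = {
    "order_id": lambda v: v is not None,
    "amount":   lambda v: isinstance(v, (int, float)) and v >= 0,
    "country":  lambda v: v in {"IN", "US", "GB"},
}

def validate(rows):
    """Split rows into (clean, rejected); each entry is (row, failed_columns)."""
    clean, rejected = [], []
    for row in rows:
        failures = [col for col, ok in RULES.items() if not ok(row.get(col))]
        (rejected if failures else clean).append((row, failures))
    return clean, rejected

rows = [
    {"order_id": 1, "amount": 250.0, "country": "IN"},
    {"order_id": None, "amount": -5, "country": "XX"},
]
clean, rejected = validate(rows)
print(len(clean), len(rejected))  # -> 1 1
```

Rejected rows would typically be written to a quarantine table with their failure reasons for later remediation.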

Posted 1 month ago

Apply

3 - 8 years

7 - 15 Lacs

Pune

Hybrid

Naukri logo

What we're looking for: Work collaboratively with Project Management, Operations, Development and Engineering to deliver all contractually required services. Investigate and respond to questions for assigned product lines. Provide data quality analysis and data mining on client-specific data and general healthcare data. Review testing executed by multiple groups. Lead the development of data quality strategy. Assist the Connector Engineering/Tools team with new business requirements and help them with Functional Specifications. Identify, document and refine standard processes. Meet with the software design team to discuss verification protocols. Identify software application weaknesses and target areas. Sketch out ideas for automated software test procedures. Review software bug reports and highlight problem areas. Write automation scripts and implement software applications. Design and install automated QA applications and dashboards. Identify quality issues and create test reports. Collaborate with the design team to solve application faults. Work with teams across regions (India and Nepal), and help facilitate workstreams. Requirements: 3+ years' experience with Data Quality. Strong data manipulation skills. Strong querying skills with SQL and data integration. Strong knowledge of healthcare enrollment, medical claims, and drug claims data required. Additional knowledge of non-traditional data types (health & wellness, workforce productivity, and EMR) preferred. Excellent written and verbal communication skills, with the ability to multitask and prioritize projects to meet scheduled deadlines. Strong interpersonal skills required. Ability to work well independently or in a team environment, and mentor other team members. Advanced programming skills including automation systems and databases. Familiarity with programming/scripting languages including Java and Python.

Posted 1 month ago

Apply

2 - 5 years

2 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

Databricks Engineer Full-time Department: Digital, Data and Cloud Company Description Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023 and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023. As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these we've seen significant growth across our practices and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally. About The Role This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer. The ideal candidate will have a proven track record as a senior/self-starting data engineer in implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, specifically Spark performance tuning/optimisation and Databricks, to play an important role in developing and delivering early proofs of concept and production implementation. 
You will ideally have experience in building solutions using a variety of open source tools & Microsoft Azure services, and a proven track record in delivering high quality work to tight deadlines. Your main responsibilities will be: Designing and implementing highly performant metadata-driven data ingestion & transformation pipelines from multiple sources using Databricks and Spark Streaming and batch processes in Databricks Spark performance tuning/optimisation Providing technical guidance for complex geospatial problems and Spark dataframes Developing scalable and re-usable frameworks for ingestion and transformation of large data sets Data quality system and process design and implementation. Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times Working with other members of the project team to support delivery of additional project components (Reporting tools, API interfaces, Search) Evaluating the performance and applicability of multiple tools against customer requirements Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints. Qualifications Direct experience of building data pipelines using Azure Data Factory and Databricks Experience required is 6 to 8 years. Building data integration with Python Databricks Engineer certification Microsoft Azure Data Engineer certification. Hands-on experience designing and delivering solutions using the Azure Data Analytics platform. Experience building data warehouse solutions using ETL / ELT tools like Informatica, Talend. Comprehensive understanding of data management best practices including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. 
Nice to have Experience working in a Dev/Ops environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform Experience working with structured and unstructured data including imaging & geospatial data. Experience with open source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j) Experience with Azure Event Hub, IoT Hub, Apache Kafka, NiFi for use with streaming data / event-based data Additional Information At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
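"Metadata driven" pipelines, as described above, usually mean a config table that drives one generic ingestion routine instead of a hand-written job per source. A minimal plain-Python sketch of the pattern (source and target names are illustrative; on Databricks the reader/writer stubs would be Spark reads and writes):

```python
# Metadata-driven ingestion: one generic routine, driven by per-source config.
PIPELINE_METADATA = [
    {"source": "orders.csv",    "target": "bronze.orders",    "key": "order_id"},
    {"source": "customers.csv", "target": "bronze.customers", "key": "customer_id"},
]

def run_ingestion(metadata, reader, writer):
    """For each configured source: read, tag rows with lineage, write, count."""
    results = {}
    for entry in metadata:
        rows = reader(entry["source"])
        for row in rows:
            row["_source"] = entry["source"]   # lineage column
        writer(entry["target"], rows)
        results[entry["target"]] = len(rows)
    return results

# Stub IO so the sketch runs anywhere; swap for spark.read / .write on Databricks.
fake_storage = {"orders.csv": [{"order_id": 1}, {"order_id": 2}],
                "customers.csv": [{"customer_id": 7}]}
sink = {}
counts = run_ingestion(PIPELINE_METADATA, fake_storage.__getitem__, sink.__setitem__)
print(counts)  # -> {'bronze.orders': 2, 'bronze.customers': 1}
```

Adding a new source then becomes a metadata entry rather than a code change, which is what makes the approach scale across many feeds.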

Posted 1 month ago

Apply

5 - 9 years

5 - 9 Lacs

Pune

Work from Office

Naukri logo

Location: Pune | Posted 30+ Days Ago | Job Requisition ID: JR26950 Job Title: Senior QA Engineer About The Role Work Location: Pune, India Relevant experience required (in years): 8-10 years. Educational Qualification: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). Your Position As a test engineer at the Digital Service Factory, you are responsible for verifying and validating our digital product in one of Vanderlande's market segments, like Baggage Handling for Airports and Automated Logistics in Warehouses and Parcel distribution. You will be challenged technically and be responsible for a (risk-based) test strategy so that you can guarantee good quality and functionality. You will implement these test plans in an emulated environment, on the actual system in our Innovation Center, paying careful attention to the analysis of any problems that arise. You will report the problems and work together with the project team to deliver a prompt, high-quality solution. Required Skills & Competencies Technical Expertise Strong proficiency in Python and deep experience with test automation frameworks. BDD Framework: Cucumber Experience in the Playwright test automation tool Ability to independently set up and maintain test automation frameworks from scratch. Experience with Databricks and Spark (a plus). Experience in testing data pipelines, time-series data, and AI systems. Familiarity with SQL and API testing tools. Cloud Knowledge Familiarity with cloud environments, particularly Azure. Experience in designing and executing tests for cloud-based systems with multiple services and steps. Leadership and Proactivity Demonstrated leadership in QA roles, with a proactive approach to identifying and addressing quality issues. Ability to work independently and drive initiatives across teams. 
AI Testing Focus and Interest Experience in testing AI models and systems, with a deep understanding of the unique challenges in validating AI components. A strong interest in AI and its applications, with a commitment to ensuring the quality and reliability of AI-driven products. Software Development Background Experience in software development, enabling effective collaboration with developers and early identification of potential quality issues. Communication and Problem-Solving Excellent communication skills to effectively collaborate with cross-functional teams and the ability to solve complex quality issues efficiently. Certification (Preferred): ISTQB AI Testing certification or equivalent, demonstrating expertise in AI testing methodologies. Roles & Responsibilities Quality Assurance Ownership Take responsibility for ensuring high-quality standards across our data pipelines, AI systems, and associated components. Establish and maintain quality benchmarks, ensuring they are met throughout the development lifecycle. Act as a key advocate for QA activities, representing the team in the QA guild, and promoting best practices across the department. Test Strategy Development and Automation Independently design, develop, and implement a comprehensive test strategy, with a strong focus on automation. Establish and maintain test automation frameworks tailored to the unique challenges of testing data pipelines, time-series data, and AI systems. AI System Testing Develop specialized testing methodologies and frameworks for validating AI models and systems. Ensure that AI components are rigorously tested for accuracy, performance, bias, and reliability. Automation and Tool Integration Lead the development and maintenance of automated test scripts, particularly for testing APIs, data pipelines, and AI-driven components. Ensure seamless integration of testing tools with our CI/CD pipelines and cloud environments, particularly on Azure. 
Proactive Collaboration and Support Work closely with cross-functional teams, including data scientists, AI engineers, data engineers, and software developers, to embed quality at every stage of the development process. Provide mentorship and training on best practices in unit testing, AI testing, and automation. Challenge the team on the architecture and design of the product from a quality perspective, ensuring that quality is considered from the early stages of development. Continuous Improvement Continuously evaluate and enhance QA methodologies, tools, and processes. Stay updated on the latest industry trends, particularly in AI, cloud technologies, and automation, to ensure our testing practices remain at the forefront of innovation. Defect Management and Reporting Identify, document, and manage defects with a proactive approach, ensuring timely resolution. Automate and maintain test reports to provide transparency on quality metrics and testing outcomes. Communication and Problem-Solving Exhibit excellent communication skills to effectively collaborate with diverse teams, and utilize strong problem-solving abilities to identify and resolve complex quality issues efficiently. About the Company Vanderlande Website Vanderlande is a market-leading, global partner for future-proof logistic process automation in the warehousing, airports and parcel sectors. Its extensive portfolio of integrated solutions (innovative systems, intelligent software and life-cycle services) results in the realization of fast, reliable and efficient automation technology. Established in 1949, Vanderlande has more than 9,000 employees, all committed to moving its Customers' businesses forward at diverse locations on every continent. It has established a global reputation over the past seven decades as a highly reliable partner for future-proof logistic process automation. 
Vanderlande was acquired in 2017 by Toyota Industries Corporation, which will help it to continue its sustainable profitable growth. The two companies have a strong strategic match, and the synergies include cross-selling, product innovations, and research and development. Why should you join Vanderlande India Global Capability Center (GCC)? We are certified as a Great Place to Work by the prestigious Great Place to Work Institute. Flexible and Hybrid Workplace. Vanderlande Academy and training facilities to boost your skills. Mediclaim benefit including parental coverage. On-site company health centers with a gym, employee wellbeing sessions, in-house doctor support. A variety of Vanderlande Network communities and initiatives. Opportunity to collaborate globally. Being you @Vanderlande (Diversity statement) Vanderlande is an equal opportunity employer. Qualified applicants will be considered without regard to race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
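Testing time-series data pipelines, as this role calls for, often starts with property checks on the output, such as monotonic timestamps and bounded gaps. A small sketch of such checks as plain functions a pytest suite could import (all names and the sample series are illustrative):

```python
def assert_monotonic(timestamps):
    """Fail if timestamps ever go backwards - a basic time-series invariant."""
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur < prev:
            raise AssertionError(f"timestamp regression: {cur} < {prev}")

def max_gap(timestamps):
    """Largest interval between consecutive samples (same units as the input)."""
    return max((b - a for a, b in zip(timestamps, timestamps[1:])), default=0)

# Example: a 1-second sensor feed with one dropped sample.
series = [0, 1, 2, 4, 5]
assert_monotonic(series)          # passes: the series never decreases
gap = max_gap(series)
print(gap)  # -> 2, i.e. one missing sample at t=3
```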

Posted 1 month ago

Apply

6 - 11 years

6 - 16 Lacs

Bhopal, Hyderabad, Pune

Hybrid

Naukri logo

Urgent opening for Sr./Lead Azure Data Engineer position!!!! Greetings from NewVision Software!!! Exp : Min 6 yrs CTC : As per company norms NP : Max 15 days Skills required : ADF, Databricks, SQL, Python JD : Job Description Position Summary: We are seeking a talented Sr. / Lead Data Engineer with a strong background in data engineering to join our team. You will play a key role in designing, building, and maintaining data pipelines using a variety of technologies, with a focus on the Microsoft Azure cloud platform. Responsibilities: Design, develop, and implement data pipelines using Azure Data Factory (ADF) or other orchestration tools. Write efficient SQL queries to extract, transform, and load (ETL) data from various sources into Azure Synapse Analytics. Utilize PySpark and Python for complex data processing tasks on large datasets within Azure Databricks. Collaborate with data analysts to understand data requirements and ensure data quality. Hands-on experience in designing and developing data lakes and warehouses. Implement data governance practices to ensure data security and compliance. Monitor and maintain data pipelines for optimal performance and troubleshoot any issues. Develop and maintain unit tests for data pipeline code. Work collaboratively with other engineers and data professionals in an Agile development environment. Preferred Skills & Experience: Good knowledge of PySpark & working knowledge of Python Full-stack Azure data engineering skills (Azure Data Factory, Databricks and Synapse Analytics) Experience with large dataset handling Hands-on experience in designing and developing data lakes and warehouses
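The ETL responsibilities above follow the classic extract-transform-load shape: read from a source, reshape with SQL, load the result into a warehouse table. A stdlib sketch of that flow, using sqlite3 in place of the source system and Synapse target (schema and table names are illustrative assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_sales (region TEXT, amount REAL);
    INSERT INTO src_sales VALUES ('south', 100), ('south', 50), ('north', 70);
    CREATE TABLE dw_sales_by_region (region TEXT, total REAL);
""")

# Extract + transform in one SQL step, then load into the warehouse table -
# the same shape an ADF pipeline gives a Databricks/Synapse activity.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM src_sales GROUP BY region ORDER BY region"
).fetchall()
conn.executemany("INSERT INTO dw_sales_by_region VALUES (?, ?)", rows)

loaded = conn.execute(
    "SELECT region, total FROM dw_sales_by_region ORDER BY region").fetchall()
print(loaded)  # -> [('north', 70.0), ('south', 150.0)]
```

In ADF the extract and load steps would be pipeline activities and the transform a Databricks notebook or Synapse SQL script, but the data flow is the same.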

Posted 1 month ago

Apply

3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Location: IN - Bangalore | Posted Today | End Date: May 22, 2025 (5 days left to apply) | Job Requisition ID: R140300 Company Overview A.P. Moller - Maersk is an integrated container logistics company and member of the A.P. Moller Group, connecting and simplifying trade to help our customers grow and thrive. With a dedicated team of over 95,000 employees, operating in 130 countries, we go all the way to enable global trade for a growing world. From the farm to your refrigerator, or the factory to your wardrobe, A.P. Moller - Maersk is developing solutions that meet customer needs from one end of the supply chain to the other. About the Team At Maersk, the Global Ocean Manifest team is at the heart of global trade compliance and automation. We build intelligent, high-scale systems that seamlessly integrate customs regulations across 100+ countries, ensuring smooth cross-border movement of cargo by ocean, rail, and other transport modes. Our mission is to digitally transform customs documentation, reducing friction, optimizing workflows, and automating compliance for a complex web of regulatory bodies, ports, and customs authorities. We deal with real-time data ingestion, document generation, regulatory rule engines, and multi-format data exchange while ensuring resilience and security at scale. Key Responsibilities Work with large, complex datasets and ensure efficient data processing and transformation. Collaborate with cross-functional teams to gather and understand data requirements. Ensure data quality, integrity, and security across all processes. Implement data validation, lineage, and governance strategies to ensure data accuracy and reliability. Build, optimize, and maintain ETL pipelines for structured and unstructured data, ensuring high throughput, low latency, and cost efficiency. Experience in building scalable, distributed data pipelines for processing real-time and historical data. 
Contribute to the architecture and design of data systems and solutions. Write and optimize SQL queries for data extraction, transformation, and loading (ETL). Advise Product Owners to identify and manage risks, debt, issues and opportunities for technical improvement. Provide continuous improvement suggestions in internal code frameworks, best practices and guidelines. Contribute to engineering innovations that fuel Maersk's vision and mission. Required Skills & Qualifications 4+ years of experience in data engineering or a related field. Strong problem-solving and analytical skills. Experience on Java, Spring Framework. Experience in building data processing pipelines using Apache Flink and Spark. Experience in distributed data lake environments (Dremio, Databricks, Google BigQuery, etc.). Experience on Apache Kafka, Kafka Streams. Experience working with databases, PostgreSQL preferred, with solid experience in writing and optimizing SQL queries. Hands-on experience in cloud environments such as Azure Cloud (preferred), AWS, Google Cloud, etc. Experience with data warehousing and ETL processes. Experience in designing and integrating data APIs (REST/GraphQL) for real-time and batch processing. Knowledge of Great Expectations, Apache Atlas, or DataHub would be a plus. Knowledge of RBAC, encryption, GDPR compliance would be a plus. Business skills Excellent communication and collaboration skills Ability to translate between technical language and business language, and communicate to different target groups Ability to understand complex design Possessing the ability to balance and find competing forces & opinions within the development team Personal profile Fact-based and result-oriented Ability to work independently and guide the team Excellent verbal and written communication Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. 
Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing .
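As an illustrative sketch of the data validation work this role describes, the following minimal Python function flags bad records before they enter a pipeline. All field names and the record shape are hypothetical, invented for this example; a production system would more likely express such rules in a framework like Great Expectations.

```python
# Hypothetical row-level data quality check, in the spirit of the
# validation/governance responsibilities above (all names invented).

def validate_manifest_row(row: dict) -> list[str]:
    """Return a list of validation errors for one manifest record."""
    errors = []
    # Required fields must be present and non-empty.
    for field in ("container_id", "port_of_loading", "port_of_discharge"):
        if not row.get(field):
            errors.append(f"missing field: {field}")
    # Weight must be a positive number when supplied.
    weight = row.get("gross_weight_kg")
    if weight is not None and (not isinstance(weight, (int, float)) or weight <= 0):
        errors.append("gross_weight_kg must be a positive number")
    return errors

# A complete record passes; an incomplete one is flagged.
ok = validate_manifest_row(
    {"container_id": "MSKU1234567", "port_of_loading": "INNSA",
     "port_of_discharge": "NLRTM", "gross_weight_kg": 21500}
)
bad = validate_manifest_row({"container_id": "", "gross_weight_kg": -5})
```

The same per-row check generalizes directly to a Spark or Flink job by applying it inside a map/filter stage and routing failing rows to a quarantine sink.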

Posted 1 month ago

Apply

8 - 13 years

15 - 20 Lacs

Pune

Hybrid


Job Description:
- Strong experience in Python programming.
- Experience with Databricks.
- Experience with databases and SQL; perform database performance tuning and optimization.
- Work with the Databricks platform for big data processing and analytics.
- Develop and maintain ETL processes using Databricks notebooks.
- Implement and optimize data pipelines for data transformation and integration.
- Design, develop, test, and deploy high-performance and scalable data solutions using Python.
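As a small sketch of the ETL pattern this posting describes, the example below uses only the Python standard library's sqlite3 module so it is self-contained; in the role itself the same extract-transform-load shape would typically run in a Databricks notebook against Spark. Table and column names are invented for illustration.

```python
# Minimal extract-transform-load sketch with stdlib sqlite3
# (hypothetical tables; a Databricks job would use Spark instead).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, "10.50"), (2, "bad"), (3, "7.25")])

# Transform: keep only rows whose amount parses as a number.
def parse_amount(value: str):
    try:
        return float(value)
    except ValueError:
        return None

rows = [(rid, parse_amount(amt))
        for rid, amt in conn.execute("SELECT id, amount FROM raw_orders")]
clean = [row for row in rows if row[1] is not None]

# Load into the target table and index it for faster lookups,
# analogous to the tuning/indexing work the posting mentions.
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)
conn.execute("CREATE INDEX idx_orders_amount ON orders (amount)")

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

The transform step deliberately quarantines unparseable rows rather than failing the whole batch, a common choice when loads must stay resilient to dirty input.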

Posted 1 month ago

Apply

5 - 8 years

9 - 14 Lacs

Bengaluru

Work from Office


Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support process by reviewing daily transactions on performance parameters:
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver (performance parameters and measures):
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management: Productivity, efficiency, absenteeism
3. Capability Development: Triages completed, Technical Test performance

Mandatory Skills: DataBricks - Data Engineering.
Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

5 - 8 years

5 - 9 Lacs

Bengaluru

Work from Office


Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support process by reviewing daily transactions on performance parameters:
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver (performance parameters and measures):
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management: Productivity, efficiency, absenteeism
3. Capability Development: Triages completed, Technical Test performance

Mandatory Skills: Azure Data Factory.
Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies