6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 6 - 15 Yrs
Location: Pan India

Job Description: The candidate must be proficient in Databricks; understands where to obtain the information needed to make appropriate decisions; demonstrates the ability to break a problem down into manageable pieces and implement effective, timely solutions; identifies the problem versus the symptoms; manages problems that require the involvement of others to solve; reaches sound decisions quickly; develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.

Roles & Responsibilities:
- Provide innovative and cost-effective solutions using Databricks
- Optimize the use of all available resources
- Develop solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit
- Learn and adapt quickly to new technologies as per business need
- Develop a team of Operations Excellence, building tools and capabilities that development teams leverage to maintain high levels of performance, scalability, security, and availability

Skills: The candidate must have 7-10 yrs of experience in Databricks Delta Lake; hands-on experience on Azure; experience in Python scripting; relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses; strong experience with relational databases and data access methods, especially SQL; knowledge of Azure architecture and design.

Interested candidates can share their resume to sankarspstaffings@gmail.com with the below inline details:
Over All Exp :
Relevant Exp :
Current CTC :
Expected CTC :
Notice Period :
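The "retrieving data from dimensional data models" skill above is, at its core, joining fact tables to dimension tables. A minimal sketch, with invented table and column names and SQLite standing in for the warehouse engine a role like this would actually target:

```python
import sqlite3

# In-memory stand-in for a warehouse; a real role would target SQL Server,
# Azure SQL, or Databricks SQL. Schema and data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER, product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'beverages'), (2, 'snacks');
    INSERT INTO fact_sales VALUES (10, 1, 5.0), (11, 1, 7.5), (12, 2, 3.0);
""")

# Classic star-schema retrieval: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('beverages', 12.5), ('snacks', 3.0)]
```

The same fact-to-dimension join pattern carries over directly to Spark SQL on Databricks.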
Posted 3 weeks ago
7.0 - 10.0 years
20 - 27 Lacs
Pune, Chennai, Coimbatore
Hybrid
Proficient working knowledge in ADB, ADF, PySpark, and SQL. A 79-year-old reputed MNC company.
Posted 3 weeks ago
4.0 - 10.0 years
6 - 12 Lacs
Kozhikode
Work from Office
Job Description: The objective of the proposed Kerala Solid Waste Management Project (KSWMP) is to strengthen the institutional and service delivery systems for SWM in Kerala. KSWMP aims to adopt a sector-wide, integrated value-chain approach for enhancing service delivery.

Roles & Responsibilities:
- Review and prepare municipal waste management plans with ULBs
- Assess technology options for waste collection, transportation, processing, and disposal
- Evaluate decentralized and regional models for integrated SWM
- Identify priority areas for waste collection and develop action plans, segregation manuals, and SOPs
- Conduct feasibility studies for collection centers, transfer stations, and landfill sites
- Determine waste collection frequency and bin placement strategies
- Develop guidelines for collection centers and transfer stations
- Assess vehicle requirements and optimize waste transportation
- Train ULBs on SWM vehicle tracking and monitoring
- Contribute to monthly/quarterly reports and other project-related tasks

Experience: Bachelor's degree in Civil/Mechanical Engineering, preferably with a master's degree in urban/regional/transportation planning, urban management, construction management, or a related discipline. About 10 years of experience in urban infrastructure projects with a focus on collection and transportation (C&T) planning activities and development of service delivery infrastructure in ULBs for biodegradable and non-biodegradable waste collection, transportation, processing and treatment, recycling, etc. Experience working on projects funded by the World Bank or ADB will be preferred. Experience handling similar projects at the local government level.
Posted 3 weeks ago
2 - 7 years
20 - 25 Lacs
Hyderabad
Work from Office
Overview
At PepsiCo, we're redefining operational excellence with a data-driven mindset, and our Global IT team is at the forefront of this transformation. Our technology teams leverage advanced analytics to deliver predictive insights, enhance operational efficiency, and create unmatched consumer and customer experiences. Our culture is guided by our core values, which define our mission to excel in the marketplace and act with integrity in everything we do. We're creating value with every initiative while promoting a sustainable and socially impactful agenda.

Responsibilities
Key Areas:
- Predictive AI-based Operations
- ServiceNow Now Assist at Service Desk and Digital Experience
- Descriptive analytics and insights generation on ServiceNow data
- Azure Cloud, data architecture, and Azure ML services for Global Service Desk, IT Service Management (ITSM), and Global Workplace
- Leadership & stakeholder management

Predictive Ops and IT Experience Management: Leverage your extensive domain expertise in ServiceNow ITSM, Service Desk Management, and End User Experience Management to identify areas for improvement and opportunities for AI and predictive IT Ops applications, building capabilities and optimizing workplace efficiency.
Azure Machine Learning: Lead the exploration and identification of predictive and forecasting use cases specifically tailored to the ServiceNow platform, focusing on maximizing business impact and user adoption using the Azure stack. Utilize Azure Machine Learning to develop and deploy predictive models, ensuring integration with Azure services and seamless operationalization.
Product Management: Prioritize and manage the Digital Brain (i.e., AI use cases) product backlog, ensuring the timely delivery of predictive models, features, and improvements. Oversee the release of high-quality predictive solutions that meet organizational goals.
Leadership and Management: Partner with the leadership to develop a strategic roadmap for applying AI and predictive capabilities across ITSM, Service Desk, and Digital Experience functions leveraging ServiceNow data.
Stakeholder Collaboration: Collaborate extensively with stakeholders to understand pain points and opportunities, translating business needs into precise user stories and actionable tasks. Ensure clear communication and alignment between business objectives and technical implementation.

Additional responsibilities:
- Lead other team members across digital projects, acting as the data science lead for the project
- Act as a subject matter expert across different digital projects, and as stream leader in innovation activities
- Partner with product managers in taking DS requirements and assessing DS components in roadmaps
- Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption
- Lead ML engineers working on industrialization
- Coordinate work activities with business teams and other IT services as required
- Drive the use of the platform toolset and deliver "the art of the possible" demonstrations to the business as needed
- Communicate with business stakeholders in the process of service design, training, and knowledge transfer
- Support large-scale experimentation and build data-driven models
- Set KPIs and metrics to evaluate an analytics solution for a given use case
- Refine requirements into modelling problems, and influence product teams through data-based recommendations
- Research state-of-the-art methodologies; create documentation for learnings and knowledge transfer; create reusable packages or libraries
- Experience in cloud-based development and deployment (Azure preferred) and in leading contractors or other team members

Qualifications
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience: Extensive experience (12+ years) in the ITSM / Service Desk transformation / IT Operations arena with exposure to predictive intelligence, data architecture, data modelling, and data engineering, with a focus on Azure cloud-based solutions.
Technical Skills: Knowledge of Azure cloud services, including Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Databricks (ADB), and Azure Machine Learning.
Domain Knowledge: Deep understanding of ServiceNow modules, specifically ITSM (incident, problem, request, and change management), coupled with extensive knowledge of predictive analytics and data science principles. Understanding of and visibility into IT Operations and Support services, including Global Workplace Services such as end-user compute, workplace management solutions, and Unified Communications and Collaboration, is an added advantage.
Analytical Skills: Outstanding analytical and problem-solving skills to translate extensive business experience into highly effective predictive intelligence solutions.
Communication: Exceptional communication and interpersonal skills, honed through years of collaboration with diverse stakeholders and vendors.
Methodologies: Extensive experience with agile methodologies and a record of working in highly dynamic and agile development environments.
Project Management: Proven ability to manage multiple projects concurrently, prioritizing tasks effectively to drive impactful results.
Leadership: Demonstrated leadership and management capabilities, with a track record of guiding teams to achieve strategic goals and fostering a collaborative team environment.
Strong knowledge of statistical/ML/AI techniques to solve supervised (regression, classification) and unsupervised problems, with a focus on time series forecasting. Experience with deep learning is a plus.

Functional Knowledge (at least one of these): IT Service Management (ITSM), IT Service Desk, ServiceNow (ITSM module), Digital Workplace Services
Technical Knowledge: Azure Machine Learning (AML) - Mandatory; Azure Databricks (ADB) - Mandatory; Azure Data Factory (ADF) - Optional; Azure Data Lake Storage (ADLS) - Optional
Certifications (at least one of these): Azure Fundamentals (AI-900), Azure AI Engineer, Azure Data Scientist, ITIL Foundation or above
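The time-series forecasting focus in this role is usually benchmarked against naive baselines. A minimal sketch of one such baseline, a moving-average forecast, in plain Python (the incident counts are invented; a production solution would use a proper forecasting library on Azure ML):

```python
# Baseline time-series forecast: each future point is the mean of the last
# `window` values (observed or previously forecast). Illustrative only.

def moving_average_forecast(series, window=3, horizon=2):
    """Forecast `horizon` future points with a rolling moving average."""
    history = list(series)
    forecasts = []
    for _ in range(horizon):
        avg = sum(history[-window:]) / window
        forecasts.append(avg)
        history.append(avg)  # feed the forecast back in for multi-step horizons
    return forecasts

# Hypothetical daily incident counts at a service desk.
incidents = [120, 130, 125, 135, 140]
print(moving_average_forecast(incidents, window=3, horizon=2))
```

Any predictive model deployed for ticket-volume forecasting should at minimum beat a baseline like this.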
Posted 1 month ago
3 - 8 years
13 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Summary: We are looking for a skilled Azure Data Engineer to join our Data & Analytics team. You will be responsible for building and optimizing our data pipelines, designing and implementing data solutions on Microsoft Azure, and enabling data-driven decision-making across the organization.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and data processing systems using Azure Data Factory, Azure Synapse Analytics, and Azure Databricks
- Integrate data from various structured and unstructured data sources into a centralized data lake or data warehouse
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality data solutions
- Optimize data flows for performance, reliability, and scalability
- Implement and manage data security, privacy, and compliance policies
- Monitor and troubleshoot data pipeline issues and ensure system reliability
- Leverage DevOps practices for CI/CD pipelines using tools like Azure DevOps or GitHub Actions
- Document data flows, architecture, and data models

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- 4+ years of experience in data engineering or a similar role
- Hands-on experience with Azure services such as: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage Gen2, Azure SQL Database or Azure SQL Managed Instance, and Azure Databricks (preferred)
- Proficiency in SQL, Python, and/or PySpark for data transformation
- Experience with data modeling, ETL/ELT processes, and data integration
- Strong understanding of data governance, security, and compliance in the cloud
- Familiarity with version control systems (e.g., Git) and CI/CD practices

Preferred Qualifications:
- Microsoft Certified: Azure Data Engineer Associate or equivalent certification
- Experience with real-time data processing using Azure Stream Analytics or Apache Kafka
- Knowledge of Power BI and data visualization best practices
- Experience working in Agile or Scrum development environments
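A typical "data transformation" step in such a pipeline cleans raw records, deduplicates on a business key, and standardises types. A minimal sketch in plain Python with invented field names (in practice this would run as a PySpark job or an ADF data flow):

```python
# Illustrative transformation step: trim/parse fields, normalise casing,
# and drop duplicate business keys before loading downstream.

raw_rows = [
    {"order_id": "1001", "amount": " 25.50", "country": "in"},
    {"order_id": "1002", "amount": "10.00", "country": "IN"},
    {"order_id": "1001", "amount": " 25.50", "country": "in"},  # duplicate key
]

def transform(rows):
    seen, cleaned = set(), []
    for row in rows:
        key = row["order_id"]
        if key in seen:  # dedupe on the business key, keeping the first record
            continue
        seen.add(key)
        cleaned.append({
            "order_id": int(key),
            "amount": float(row["amount"].strip()),
            "country": row["country"].upper(),
        })
    return cleaned

print(transform(raw_rows))
```

The same logic maps naturally onto `dropDuplicates` plus column expressions in PySpark.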
Posted 1 month ago
6 - 8 years
8 - 12 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Skill Set - Azure Data Engineer (ADF, ADB), Python
Work Type - Contract on third-party payroll, Hybrid
Location - Bangalore, Chennai, Hyderabad
Notice Period - Immediate Joiner
Posted 1 month ago
5 - 9 years
22 - 32 Lacs
Noida, Kolkata, Hyderabad
Hybrid
Good experience in: Hadoop, SQL, Azure (ADF, ADB, ADLS, Log Analytics, Logic App, Key Vault, Blob Storage). A 79-year-old reputed MNC company.
Posted 1 month ago
12 - 15 years
15 - 17 Lacs
Bengaluru
Work from Office
About The Role

Overview: Technology for today and tomorrow
The Boeing India Engineering & Technology Center (BIETC) is a 5,500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.

People-driven culture
At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts, enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people's careers and being thoughtful about employee wellbeing.

The Boeing India Software Engineering team is currently looking for one Lead Software Engineer (Developer) to join the team in Bengaluru, KA. As an ETL Developer, you will be part of the Application Solutions team, which develops software applications and digital products that create direct value for its customers. We provide revamped work environments focused on delivering data-driven solutions at a rapidly increased pace over traditional development. Be a part of our passionate and motivated team who are excited to use the latest software technologies for modern web and mobile application development. Through our products we deliver innovative solutions to our global customer base at an accelerated pace.

Position Responsibilities:
- Perform data mining and collection procedures; ensure data quality and integrity
- Interpret and analyze data problems; visualize data and create reports
- Experiment with new models and techniques; determine how data can be used to achieve customer/user goals
- Design data modeling processes; create algorithms and predictive models for analysis
- Enable development of prediction engines, pattern detection analysis, and optimization algorithms
- Develop guidance for analytics-based wireframes; organize and conduct data assessments
- Discover insights from structured and unstructured data
- Estimate user stories/features (story point estimation) and tasks in hours with the required level of accuracy, and commit to them as part of sprint planning
- Contribute to backlog grooming meetings by promptly asking relevant questions to ensure requirements achieve the right level of DOR
- Raise any impediments/risks (technical/operational/personal) and approach the Scrum Master/Technical Architect/PO accordingly to arrive at a solution
- Update the status and remaining effort for tasks on a daily basis
- Ensure change requests are treated correctly and tracked in the system, impact analysis is done, and risks/timelines are appropriately communicated
- Hands-on experience in understanding aerospace domain-specific data
- Coordinate with data scientists on data preparation and exploration, making data ready
- Clear understanding of defining data products and monetizing them; experience in building self-service capabilities for users
- Build quality checks across the data lineage; responsible for designing and implementing different data patterns
- Influence stakeholders for funding and build the vision of the product in terms of usage, productivity, and scalability of the solutions
- Build impactful, outcome-based solutions/products

Basic Qualifications (Required Skills/Experience):
- Bachelor's or Master's degree
- 12-15 years of experience as a data engineer
- Expertise in SQL and Python; knowledge of Java, Oracle, R, data modeling, and Power BI
- Experience in understanding and interacting with multiple data formats
- Ability to rapidly learn and understand software from source code
- Expertise in understanding, analyzing, and optimizing large, complicated SQL statements
- Strong knowledge and experience in SQL Server, database design, and ETL queries
- Develop software models to simulate real-world problems, helping operational leaders understand which variables to focus on
- Proficiency in streamlining and optimizing databases for efficient and consistent data consumption
- Strong understanding of data warehouse, data lake, and data mesh concepts
- Familiarity with ETL tools and data ingestion patterns
- Hands-on experience in building data pipelines using GCP
- Hands-on experience in writing complex SQL (NoSQL is a big plus)
- Hands-on experience with data pipeline orchestration tools such as Airflow/GCP Composer
- Hands-on experience in data modelling
- Experience in leading diverse teams
- Experience in performance tuning of large data warehouses/data lakes
- Exposure to prompt engineering, LLMs, and vector DBs
- Python, SQL, and PySpark; Spark ecosystem (Spark Core, Spark Streaming, Spark SQL) / Databricks
- Azure (ADF, ADB, Logic Apps, Azure SQL Database, Azure Key Vault, ADLS, Synapse)

Preferred Qualifications (Desired Skills/Experience):
- Pub/Sub, Terraform
- Deep learning (TensorFlow), machine learning, NLP
- Time series; BI/visualization tools (Power BI and Tableau); languages (R/Python)

Typical Education & Experience: Education/experience typically acquired through advanced education (e.g., Bachelor's) and typically 12 to 15 years' related work experience, or an equivalent combination of education and experience (e.g., Master's + 11 years of related work experience).

Relocation: This position offers relocation within India, based on candidate eligibility.
Export Control Requirements: This is not an Export Control position.
Education: Bachelor's Degree or Equivalent Required
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
Shift: Not a Shift Worker (India)
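The orchestration experience this posting asks for (Airflow/GCP Composer) comes down to expressing a pipeline as a dependency DAG and running tasks in topological order. A library-free sketch of that core idea, with invented task names (a real DAG would attach operators, schedules, and retries to each node):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Minimal model of what an orchestrator does: each task maps to the set of
# tasks it depends on, and execution must respect those edges.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # both extracts precede the join; the load runs last
```

Airflow expresses the same edges with `extract >> transform >> load` operators, but the scheduling semantics are exactly this topological ordering.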
Posted 1 month ago
5 - 10 years
0 Lacs
Chennai, Coimbatore, Bengaluru
Hybrid
Open & Direct Walk-in Drive Event | Hexaware Technologies - Azure Data Engineer/Architect in Chennai, Tamil Nadu on 10th May [Saturday] 2025 - Azure Databricks / Data Factory / SQL & PySpark

Dear Candidate,

I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as an Azure Data Engineer. We are hosting an Open Walk-in Drive in Chennai, Tamil Nadu on 10th May [Saturday] 2025, and we believe your skills in Databricks, Data Factory, SQL, and PySpark align perfectly with what we are seeking.

Details of the Walk-in Drive:
Date: 10th May [Saturday] 2025
Experience: 5 years to 12 years
Time: 9.00 AM to 5.00 PM
Venue: HEXAWARE TECHNOLOGIES, H-5, SIPCOT IT Park, Post, Navalur, Siruseri, Tamil Nadu 603103
Point of Contact: Azhagu Kumaran Mohan / +91-9789518386

Key Skills and Experience: As an Azure Data Engineer, we are looking for candidates who possess expertise in the following: Databricks, Data Factory, SQL, PySpark/Spark.

Roles and Responsibilities: As a part of our dynamic team, you will be responsible for:
- Designing, implementing, and maintaining data pipelines
- Collaborating with cross-functional teams to understand data requirements
- Optimizing and troubleshooting data processes
- Leveraging Azure data services to build scalable solutions

What to Bring: Updated resume; photo ID; passport-size photo.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.
*** Candidates with less than 4 years of total experience will not be screen-selected to attend the interview ***
Posted 1 month ago
12 - 22 years
35 - 65 Lacs
Chennai
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8 - 24 Yrs
Location: Pan India

Job Description: Candidates should have a minimum of 2 years' hands-on experience as an Azure Databricks Architect.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 1 month ago
10 - 18 years
35 - 55 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8 Yrs - 18 Yrs
Location: Pan India

Job Description:
- Experience in Synapse with PySpark
- Knowledge of Big Data pipelines / data engineering
- Working knowledge of the MSBI stack on Azure
- Working knowledge of Azure Data Factory, Azure Data Lake, and Azure Data Lake Storage
- Hands-on in visualization tools like Power BI
- Implement end-to-end data pipelines using Cosmos / Azure Data Factory
- Good analytical thinking and problem solving
- Good communication and coordination skills; able to work as an individual contributor
- Requirement analysis; create, maintain, and enhance Big Data pipelines
- Daily status reporting, interacting with leads
- Version control (ADO/Git), CI/CD
- Marketing campaign experience; data platform product telemetry
- Data validation and data quality checks of new streams
- Monitoring of data pipelines created in Azure Data Factory; updating the tech spec and wiki page for each pipeline implementation; updating ADO on a daily basis

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 1 month ago
10 - 20 years
35 - 55 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8 Yrs - 18 Yrs
Location: Pan India

Mandatory Skill: Azure ADB with Azure Data Lake

Job Description:
- Lead the architecture design and implementation of advanced analytics solutions using Azure Databricks and Fabric. The ideal candidate will have a deep understanding of big data technologies, data engineering, and cloud computing, with a strong focus on Azure Databricks, along with strong SQL
- Work closely with business stakeholders and other IT teams to understand requirements and deliver effective solutions
- Oversee the end-to-end implementation of data solutions, ensuring alignment with business requirements and best practices
- Lead the development of data pipelines and ETL processes using Azure Databricks, PySpark, and other relevant tools
- Integrate Azure Databricks with other Azure services (e.g., Azure Data Lake, Azure Synapse, Azure Data Factory) and on-premise systems
- Provide technical leadership and mentorship to the data engineering team, fostering a culture of continuous learning and improvement
- Ensure proper documentation of architecture, processes, and data flows, while ensuring compliance with security and governance standards
- Ensure best practices are followed in terms of code quality, data security, and scalability
- Stay updated with the latest developments in Databricks and associated technologies to drive innovation

Essential Skills:
- Strong experience with Azure Databricks, including cluster management, notebook development, and Delta Lake
- Proficiency in big data technologies (e.g., Hadoop, Spark) and data processing frameworks (e.g., PySpark)
- Deep understanding of Azure services like Azure Data Lake, Azure Synapse, and Azure Data Factory
- Experience with ETL/ELT processes, data warehousing, and building data lakes
- Strong SQL skills and familiarity with NoSQL databases
- Experience with CI/CD pipelines and version control systems like Git
- Knowledge of cloud security best practices

Soft Skills:
- Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
- Strong problem-solving skills and a proactive approach to identifying and resolving issues
- Leadership skills, with the ability to manage and mentor a team of data engineers

Experience:
- Demonstrated expertise of 8 years in developing data ingestion and transformation pipelines using Databricks/Synapse notebooks and Azure Data Factory
- Solid understanding and hands-on experience with Delta tables, Delta Lake, and Azure Data Lake Storage Gen2
- Experience in efficiently using Auto Loader and Delta Live Tables for seamless data ingestion and transformation
- Proficiency in building and optimizing query layers using Databricks SQL
- Demonstrated experience integrating Databricks with Azure Synapse, ADLS Gen2, and Power BI for end-to-end analytics solutions
- Prior experience in developing, optimizing, and deploying Power BI reports
- Familiarity with modern CI/CD practices, especially in the context of Databricks and cloud-native solutions

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
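The Delta Lake ingestion work described above centers on upsert (MERGE) semantics: match incoming rows to existing ones on a key, update matches, insert the rest. A minimal sketch of that behavior in plain Python with invented keys and fields (on Databricks this is a single `MERGE INTO` statement against a Delta table):

```python
# Upsert (MERGE) semantics on plain dicts, purely to illustrate what a
# Delta-table MERGE does: matched keys are updated, unmatched keys inserted.

def merge_upsert(target, updates, key="id"):
    """Return `target` with `updates` applied: rows whose key matches are
    replaced; rows with new keys are appended."""
    by_key = {row[key]: row for row in target}
    for row in updates:
        by_key[row[key]] = row
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
print(merge_upsert(target, updates))
```

Auto Loader and Delta Live Tables build on the same primitive: incrementally discovered files feed a stream whose rows are merged into the target table.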
Posted 1 month ago
11 - 20 years
20 - 35 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 11 - 20 Yrs
Location: Pan India

Job Description: Minimum 2 years' hands-on experience as a Solution Architect (AWS Databricks).

If interested, please forward your updated resume to sankarspstaffings@gmail.com

With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 1 month ago
5 - 10 years
11 - 21 Lacs
Hyderabad
Remote
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai (Preferred: Hyderabad)

- At least 5+ years of relevant hands-on development experience in an Azure Data Engineering role
- Proficient in Azure technologies like ADB, ADF, SQL (capable of writing complex SQL queries), PySpark, Python, Synapse, Delta Tables, and Unity Catalog
- Hands-on in Python, PySpark, or Spark SQL
- Hands-on in Azure Analytics and DevOps
- Taking part in Proof of Concepts (POCs) and pilot solution preparation
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows
- Experience in business process mapping of data and analytics solutions
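"Complex SQL" in roles like this usually means window functions and similar analytic constructs. A small illustrative example, with invented table and column names and SQLite standing in for Azure SQL or Databricks SQL:

```python
import sqlite3

# Rank each customer's orders by amount using a window function -- a typical
# "complex SQL" interview/on-the-job pattern. Data is invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES
        ('acme', 1, 50.0), ('acme', 2, 80.0), ('globex', 3, 30.0);
""")

rows = conn.execute("""
    SELECT customer, order_id,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()
print(rows)  # [('acme', 2, 1), ('acme', 1, 2), ('globex', 3, 1)]
```

The identical `RANK() OVER (PARTITION BY ... ORDER BY ...)` syntax works in Spark SQL on Databricks.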
Posted 1 month ago
4 - 9 years
5 - 10 Lacs
Chennai, Pune, Bengaluru
Hybrid
Azure DevOps | 4+ yrs | Bangalore/Chennai/Pune
Skills: Azure DevOps, Terraform, Kubernetes, CI/CD, ADF/ADB, infrastructure cloud
Excellent communication skills
Share CV to Yogitha@ontimesolutions.in or 7406353337
Posted 2 months ago
6 - 11 years
8 - 13 Lacs
Bengaluru
Work from Office
Data Engineer with ADF & ADB (MRF00825) - J48819
Role: Data Engineer
Location: Anywhere in India - Work from home
All the below skills/experience are mandatory:
Total exp:
Exp in/as Data Engineer:
Exp in Azure Data Factory:
Exp in Azure Databricks:
Exp in Power BI:
Exp in PySpark:
Exp in Python:
Required Candidate Profile
Candidate experience should be: 6 to 15 years
Candidate degree should be: BE-Comp/IT, BE-Other, BTech-Comp/IT, BTech-Other, MCA, MCS, ME-Comp/IT, ME-Other, MIS, MIT, MSc-Comp/IT, MS-Comp/IT, MSc-Other, MS-Other, MTech-Comp/IT, MTech-Other
Posted 2 months ago
7 - 12 years
15 - 30 Lacs
Bengaluru
Remote
Lead / Senior Azure Data Engineer
Job Location: Hyderabad / Bangalore / Chennai / Noida / Gurgaon / Pune / Indore / Mumbai / Kolkata

Responsibilities:
- At least 7+ years' experience in a Data Engineering role on analytical projects, preferably on Microsoft Azure
- Proficient in Azure technologies like ADF, Azure Synapse, Databricks, and Analysis Services
- Proficient in Azure Tables, Cache, SQL Server, and Azure AD
- Solid understanding of cloud security, leveraging Windows operating systems, Active Directory, and federated AD with market-leading SSO solutions
- Knowledge of Python, PySpark, or Spark SQL
- Experience in Azure Analytics and DevOps
- Preparing requirements analysis and data architecture design
- Designing and delivering Azure data analytics solutions
- Providing the Azure technological vision in project implementation for analytical projects
- Taking part in Proof of Concepts (POCs) and pilot solution preparation
- Leading the implementation, with responsibility for delivery of the architecture designs and data flow strategy
- Experience with preparing data for data science and machine learning purposes
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows
- Experience in business process mapping of data and analytics solutions
Posted 2 months ago
8 - 10 years
19 - 30 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Work from Office
Azure Databricks, PySpark, Azure Data Factory
Posted 2 months ago
2 - 4 years
7 - 10 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
APT Test Generic
Location: Bangalore, Hyderabad, Chennai, Pune
Skills/Experience: Good understanding of test case development, test planning, and log analysis; Android architecture; Android tools like ADB, Fastboot, CTS, VTS, and GTS; and Python or Shell
Domain: APT Testing
Experience (years): 2 to 4 years
Posted 2 months ago
4 - 9 years
7 - 17 Lacs
Chennai, Pune, Bengaluru
Work from Office
Role & responsibilities **Big Data Lead** - Must-Have Skills: Pyspark, Databricks, SQL - Nice-to-Have Skills: ADF - Experience: 4-12 years - Work Locations: Bangalore/Chennai/Pune - Notice Period: Immediate - 30 days
Posted 2 months ago
5 - 10 years
20 - 22 Lacs
Nasik, Pune
Work from Office
Data Engineer - Azure Synapse & Data Pipelines Expert
Exp: 5 to 8 Years
Location: Nashik/Pune (Work from office only)
Design, develop, and maintain efficient data pipelines using Azure Synapse, Azure Data Factory (ADF), and Azure Databricks (ADB).
Required Candidate Profile: Experience with Azure Synapse Analytics, ADF, ADB, PySpark, and SQL. Proven expertise in designing and optimizing complex data pipelines for high performance and reliability. Experience with Data Lake.
Posted 2 months ago
5 - 10 years
16 - 27 Lacs
Bengaluru
Work from Office
We have 2 requirements:
1 - Azure Engineer Specialist. Mandatory skills: ADB, PySpark, ADF, Delta Lake
2 - Azure Lead. Mandatory skills: ADF, ADB, Synapse, Python, Erwin
Also looking for someone who has hands-on experience in team leading.

A Data Engineer utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle the distribution of data types and volumes in support of data architecture design. A Senior Data Engineer designs and oversees the entire data infrastructure, data products, and data pipelines that are resilient to change, modular, flexible, scalable, reusable, and cost-effective.

Key Responsibilities:
- Design and oversee the entire data architecture strategy
- Mentor junior data architects to ensure skill development in alignment with the team strategy
- Design and implement complex, scalable, high-performance data architectures that meet business requirements
- Model data for optimal reuse, interoperability, security, and accessibility
- Develop and maintain data flow diagrams and data dictionaries
- Collaborate with stakeholders to understand data needs and translate them into technical solutions
- Ensure data accessibility through a performant, cost-effective consumption layer that supports use by citizen developers, data scientists, AI, and application integration
- Ensure data quality, integrity, and security across all data systems

Qualifications:
- Experience in Erwin, Azure Synapse, Azure Databricks, Azure DevOps, SQL, Power BI, Spark, Python, and R
- Ability to drive business results by building optimal-cost data landscapes
- Familiarity with Azure AI/ML services; Azure analytics (Event Hub, Azure Stream Analytics); scripting (Ansible)
- Experience with machine learning and advanced analytics
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes)
- Understanding of CI/CD pipelines and automated testing frameworks
- Certifications such as AWS Certified Solutions Architect, IBM Certified Data Architect, or similar are a plus
Posted 2 months ago
6 - 10 years
8 - 12 Lacs
Bengaluru
Work from Office
Our Team
The SAP ABAP Developer - Specialist will be a part of the Business Segments Delivery Team for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. KGS, India is being developed to extend its IT operations, as well as act as a hub for innovation in the IT function. As KGS rapidly scales up its operations in India, its employees will get opportunities to carve out a career path within the organization. This role offers the opportunity to join on the ground floor and will play a critical part in helping build out KGS over the next several years. Working closely with global colleagues provides significant global exposure. This role is part of the Georgia-Pacific team within KGS; GP is wholly owned by Koch Industries.

Your Job
The SAP ABAP Developer - Specialist will report to the SAP Development Lead of KGS and will be part of an international team that designs, develops, and delivers new applications for Koch Industries.

What You Will Need To Bring With You:
Providing on-shift and on-call support for the NACP business systems applications, including an on-call rotation for 24x7 system support
Working with functional and technical teams that are both India and US based
Collaborating with various teams such as infrastructure/basis support, integration developers, and business application developers
Completing key project work and support activities
Adopting best practices in the implementation and execution of processes
Translating functional specifications into technical design specifications
Developing enhancements, forms, workflows, or interfaces to meet business requirements
Evaluating technical solutions, providing alternatives, and recommending an approach to solve a problem
Executing technical unit test scenarios
Using your technical and process knowledge to come up to speed on new technologies and tools required for SAP development and support
Challenging the status quo and focusing on long-term value when designing solutions
Troubleshooting and resolving issues within the system
Articulating complex concepts and ideas to functional teams
Maintaining efficient and reusable code
Leading team code review sessions and sharing coding best practices to enhance developer knowledge and experience

Who You Are (Basic Qualifications)
Minimum of 6 years' experience in SAP application development utilizing ABAP and ABAP on HANA
Minimum of 3 years' experience with development tools such as UI5, Fiori, and OData services
Development experience with HANA AMDP, CDS, ADB, BTP services, and CAP/RAP programming
Experience with RICEFW objects
Experience developing user exits, BAdIs, BTEs, and enhancement points
Experience with interfaces: EDI/ALE/IDocs, RFC, BAPIs, Forms
Experience with SAP Workflow
Experience with the Switch Framework
Experience developing in the SAP FI/CO, New GL, Order to Cash, Extended WM, Procure to Pay, Plant Maintenance, and Supply Chain modules
Extensive experience in debugging SAP code
Experience with performance tuning, runtime analysis, and system monitoring
Experience using ABAP Objects and controls technology
Strong written and oral communication skills
Strong analytical and problem-solving skills
Ability to work independently as well as in a collaborative team environment
Detail oriented

What Will Put You Ahead:
Experience with the ABAP Eclipse tool for development
SAP data archiving
Hands-on experience in SAP Solution Manager
Experience developing extensions on SAP Cloud Platform
Experience developing for mobile equipment (e.g., tablets, RF devices, vehicle-mounted devices)
Must be willing to carry a company-provided smartphone, have home high-speed Internet, and participate in a 12x7 on-call rotation. On-call support is only for P1 incidents on a rotational basis, once every 4 to 6 weeks (6 AM to 6 PM IST), with occasional weekend support (night shifts).
Posted 2 months ago
6 - 8 years
27 - 42 Lacs
Bengaluru
Work from Office
Experience working with Azure services such as Azure Data Factory, Azure Functions, Azure SQL, Azure Databricks, Azure Data Lake, and Synapse Analytics. Strong SQL, data modeling, and Agile experience. Mentor teams and deploy platforms. Big Data and ML knowledge is a plus.
Posted 2 months ago
6 - 8 years
8 - 12 Lacs
Bengaluru
Work from Office
Our Team
The SAP ABAP Developer - Specialist will be a part of the Business Segments Delivery Team for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. KGS, India is being developed to extend its IT operations, as well as act as a hub for innovation in the IT function. As KGS rapidly scales up its operations in India, its employees will get opportunities to carve out a career path within the organization. This role offers the opportunity to join on the ground floor and will play a critical part in helping build out KGS over the next several years. Working closely with global colleagues provides significant global exposure. This role is part of the Georgia-Pacific team within KGS; GP is wholly owned by Koch Industries.

Your Job
The SAP ABAP Developer - Specialist will report to the SAP Development Lead of KGS and will be part of an international team that designs, develops, and delivers new applications for Koch Industries.

What You Will Need To Bring With You:
Providing on-shift and on-call support for the NACP business systems applications, including an on-call rotation for 24x7 system support
Working with functional and technical teams that are both India and US based
Collaborating with various teams such as infrastructure/basis support, integration developers, and business application developers
Completing key project work and support activities
Adopting best practices in the implementation and execution of processes
Translating functional specifications into technical design specifications
Developing enhancements, forms, workflows, or interfaces to meet business requirements
Evaluating technical solutions, providing alternatives, and recommending an approach to solve a problem
Executing technical unit test scenarios
Using your technical and process knowledge to come up to speed on new technologies and tools required for SAP development and support
Challenging the status quo and focusing on long-term value when designing solutions
Troubleshooting and resolving issues within the system
Articulating complex concepts and ideas to functional teams
Maintaining efficient and reusable code
Leading team code review sessions and sharing coding best practices to enhance developer knowledge and experience

Who You Are (Basic Qualifications)
Minimum of 6 years' experience in SAP application development utilizing ABAP and ABAP on HANA
Minimum of 3 years' experience with development tools such as UI5, Fiori, and OData services
Development experience with HANA AMDP, CDS, ADB, BTP services, and CAP/RAP programming
Experience with RICEFW objects
Experience developing user exits, BAdIs, BTEs, and enhancement points
Experience with interfaces: EDI/ALE/IDocs, RFC, BAPIs, Forms
Experience with SAP Workflow
Experience with the Switch Framework
Experience developing in the SAP FI/CO, New GL, Order to Cash, Extended WM, Procure to Pay, Plant Maintenance, and Supply Chain modules
Extensive experience in debugging SAP code
Experience with performance tuning, runtime analysis, and system monitoring
Experience using ABAP Objects and controls technology
Strong written and oral communication skills
Strong analytical and problem-solving skills
Ability to work independently as well as in a collaborative team environment
Detail oriented

What Will Put You Ahead:
Experience with the ABAP Eclipse tool for development
SAP data archiving
Hands-on experience in SAP Solution Manager
Experience developing extensions on SAP Cloud Platform
Experience developing for mobile equipment (e.g., tablets, RF devices, vehicle-mounted devices)
Must be willing to carry a company-provided smartphone, have home high-speed Internet, and participate in a 12x7 on-call rotation. On-call support is only for P1 incidents on a rotational basis, once every 4 to 6 weeks (6 AM to 6 PM IST), with occasional weekend support (night shifts).
Posted 2 months ago