
8586 Data Modeling Jobs - Page 25

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

EsyCommerce is seeking a highly experienced and skilled SFMC (Salesforce Marketing Cloud) Technical Lead for a full-time position in Bangalore. You will be responsible for leading the design and implementation of scalable, effective SFMC solutions for our clients, and proven expertise in leading such design and implementation work within Salesforce Marketing Cloud is essential. This role requires a strong understanding of marketing automation best practices, data modeling, audience segmentation, and automation strategies within the SFMC platform. You will work closely with cross-functional teams, provide technical leadership to SFMC developers, and ensure the successful delivery of high-quality marketing automation solutions.

Posted 1 week ago

Apply

0.0 - 6.0 years

5 - 6 Lacs

Hyderabad, Ahmedabad, Bengaluru

Work from Office

Job Title: Associate Analyst. Job Description: Support the revenue, programmatic, and native operations teams on all reporting and analysis tasks in the department. Primary responsibilities: Assist teams and our syndication partners by providing daily pacing reports which help them better optimize delivery and hit their delivery goals on time. Create and distribute reporting on key performance indicators based on the cadence agreed upon between the teams, and provide business insights on all KPIs. Set up programmatic deals in our exchange partners' user interfaces. Extract, transform, and blend data from multiple sources to create the datasets on which models are built and research is conducted. Mandatory Skills: Microsoft Excel, Data Modeling, Data Visualization, Good Communication. Good to Have: Knowledge of online advertising and digital media functions, understanding of SQL. Designation: Associate Analyst. Working Hours: 1 PM - 10 PM IST. Work Location: Ecoworld, Bengaluru. It is the policy of Dotdash Meredith to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, the Company will provide reasonable accommodations for qualified individuals with disabilities. #INDIA#

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

About this opportunity: Ericsson's Automation Chapter is seeking highly motivated and self-driven Data Engineers and Senior Data Engineers with strong expertise in SAP HANA and SAP BODS. The ideal candidates will be focused on SAP-centric development and integration, ensuring that enterprise data flows are robust, scalable, and optimized for analytics consumption. You will collaborate with a high-performing team that builds and supports end-to-end data solutions aligned with our SAP ecosystem. You are an adaptable and flexible problem-solver with deep hands-on experience in HANA modeling and ETL workflows, capable of switching contexts across a range of projects of varying scale and complexity. What you will do: Design, develop, and optimize SAP HANA objects such as Calculation Views, SQL Procedures, and Custom Functions. Develop robust and reusable ETL pipelines using SAP BODS for both SAP and third-party system integration. Enable seamless data flow between SAP ECC and external platforms, ensuring accuracy and performance. Collaborate with business analysts, architects, and integration specialists to translate requirements into technical deliverables. Tune and troubleshoot HANA and BODS jobs for performance, scalability, and maintainability. Ensure compliance with enterprise data governance, lineage, and documentation standards. Support ongoing enhancements, production issues, and business-critical data deliveries. Experience: 8+ years of experience in SAP data engineering roles. Strong hands-on experience in SAP HANA (native development, modeling, SQL scripting). Proficient in SAP BODS, including job development, data flows, and integration techniques. Experience working with SAP ECC data structures, IDocs, and remote function calls. Knowledge of data warehouse concepts, data modeling, and performance optimization techniques. Strong debugging and analytical skills, with the ability to independently drive technical solutions. Familiarity with version control tools and SDLC processes. Excellent communication skills and ability to work collaboratively in cross-functional teams. Education: Bachelor's degree in Computer Science, Information Systems, Electronics & Communication, or a related field. Primary country and city: India (IN) || Bangalore. Req ID: 770551

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 10 Lacs

Noida

Work from Office

Not Applicable. Specialism: Data, Analytics & AI. Management Level: Associate. Summary: In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Responsibilities: Design and develop deep learning models for computer vision tasks such as object detection (e.g., YOLO, Faster R-CNN), image classification (e.g., ResNet, EfficientNet), semantic/instance segmentation (e.g., U-Net, Mask R-CNN), and video frame analysis, tracking, or scene understanding. Use OpenCV for image/video preprocessing and classical vision tasks: filtering, contour detection, edge detection, motion tracking, image transformations. Prepare and manage large-scale image/video datasets: frame extraction, annotation support, augmentation. Evaluate models using appropriate metrics (IoU, mAP, F1, precision-recall curves). Fine-tune or build from scratch using frameworks like PyTorch or TensorFlow. Collaborate with cross-functional teams to translate requirements into deployable models. Write clean, modular, and well-documented Python code for training and experimentation. Must-Have Skills: 4-6 years of experience in applied deep learning roles with a focus on computer vision. Proficiency in Python with strong skills in data structures, vectorization, and modular coding. Hands-on experience with OpenCV for traditional image/video processing tasks. Strong understanding of CNN architectures and common vision model types. Deep learning experience with PyTorch or TensorFlow. Practical experience with vision datasets (e.g., COCO, Pascal VOC, custom video/image data). Familiarity with model training, loss functions, optimization techniques, and performance tuning. Nice-to-Have Skills: Experience with video analytics or multi-object tracking. Familiarity with Albumentations, imgaug, or other augmentation libraries. Exposure to transformer-based vision models (e.g., ViT, Swin). Basic understanding of explainability techniques for vision models (e.g., Grad-CAM). Experience working with edge devices, embedded systems, or real-time CV (optional). Mandatory skill sets: Python, OpenCV, deep learning, GenAI, machine learning, data science. Preferred skill sets: Data analysis, SQL, MLOps. Years of experience required: 5+. Education qualification: BE/B.Tech/MBA/MCA. Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration. Required Skills: Structured Query Language (SQL). Accepting Feedback, Active Listening, Algorithm Development, Alteryx (Automation Platform), Analytic Research, Big Data, Business Data Analytics, Communication, Complex Data Analysis, Conducting Research, Customer Analysis, Customer Needs Analysis, Dashboard Creation, Data Analysis, Data Analysis Software, Data Collection, Data-Driven Insights, Data Integration, Data Integrity, Data Mining, Data Modeling, Data Pipeline, Data Preprocessing, Data Quality {+ 33 more}
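For readers less familiar with the stack this role names, here is a minimal, illustrative sketch of how OpenCV preprocessing typically feeds a PyTorch classifier. The image path, the choice of a pretrained torchvision ResNet, and the ImageNet normalisation constants are assumptions for illustration only and are not taken from the posting; a recent torchvision with bundled weights is assumed.

```python
# Minimal sketch: OpenCV preprocessing feeding a PyTorch classifier.
# "sample.jpg", the ResNet-50 model, and the normalisation constants are illustrative.
import cv2
import torch
from torchvision import models, transforms

# Classical preprocessing with OpenCV: read, convert BGR -> RGB, resize.
img_bgr = cv2.imread("sample.jpg")                      # HxWx3, BGR order
img_rgb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB)
img_rgb = cv2.resize(img_rgb, (224, 224))

# Convert to a normalised tensor the way ImageNet-trained models expect.
to_tensor = transforms.Compose([
    transforms.ToTensor(),                              # HWC uint8 -> CHW float in [0, 1]
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
batch = to_tensor(img_rgb).unsqueeze(0)                 # add batch dimension: 1x3x224x224

# Pretrained ResNet-50 as a stand-in for whatever classification model a team might use.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()
with torch.no_grad():
    logits = model(batch)
    top_prob, top_class = logits.softmax(dim=1).max(dim=1)

print(f"predicted class index {top_class.item()} with probability {top_prob.item():.3f}")
```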

Posted 1 week ago

Apply

7.0 - 12.0 years

12 - 14 Lacs

Bengaluru

Work from Office

Jul 25, 2025. Location: Bengaluru. Designation: Senior Consultant. Entity: Deloitte Touche Tohmatsu India LLP. Your potential, unleashed. India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters. The Team: Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning. Your work profile: We are looking for an experienced ETL and Data Migration Tester to join our QA team. The ideal candidate will have strong experience in testing complex data pipelines, validating data transformations, and supporting large-scale data migration projects across cloud or on-premise platforms. Key Responsibilities: Analyze ETL mappings, business rules, and data flow diagrams to design test strategies. Develop and execute test cases and SQL queries to validate data transformation logic, data quality, and completeness. Perform source-to-target data validation and reconciliation between legacy and target systems. Validate data migration from legacy systems to data warehouses or cloud platforms (e.g., Azure, AWS, GCP). Collaborate with ETL developers, business analysts, and data architects to ensure data accuracy and integrity. Conduct smoke testing, system testing, regression testing, and UAT support. Work with data profiling and data quality tools to identify anomalies. Log, track, and close defects using tools like JIRA, ALM, or Azure DevOps. Participate in Agile/Scrum ceremonies and contribute to sprint goals and QA metrics. Required Skills: 7+ years of QA experience with at least 4+ years in ETL testing and data migration projects. Proficient in writing complex SQL queries for data validation (joins, aggregations, subqueries). Strong understanding of ETL tools such as Informatica, Talend, SSIS, or equivalent. Experience in validating large-scale data migration projects (on-prem to cloud or system upgrades). Hands-on experience in testing data warehouse and data lake environments. Familiarity with data modeling concepts (star and snowflake schemas). Experience with Agile methodology and tools like JIRA, Confluence, Azure DevOps. Knowledge of data profiling, data cleansing, and DQ tools (e.g., Informatica DQ, Talend DQ) is a plus. Good to Have: Knowledge of cloud data platforms like Azure Synapse, AWS Redshift, or Google BigQuery. Experience with Python or shell scripting for test automation or data manipulation. Exposure to BI tools such as Power BI, Tableau, or Qlik for validating reporting layers. Understanding of data security, PII masking, and compliance requirements.

Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. ETL tool certifications or Data Engineering-related credentials (preferred but not mandatory). Location and way of working: Base location: Bangalore. Hybrid is our default way of working; each domain has customized the hybrid approach to its unique needs. How you'll grow - Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership. Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters. Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte. Everyone's welcome - entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you. Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
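As a rough, illustrative sketch of the source-to-target validation and reconciliation work this posting describes, the snippet below compares row counts and column aggregates between a legacy table and a migrated table. SQLite and the table/column names are assumptions chosen only so the example is self-contained; in practice the same queries would run against the actual source and target systems.

```python
# Minimal sketch of source-to-target reconciliation: compare row counts and simple
# aggregates between a "legacy" table and a "migrated" table, then diff key values.
# SQLite is used only so the example runs standalone; names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Illustrative source and target tables (names and columns are assumptions).
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

checks = {
    "row_count":    "SELECT COUNT(*) FROM {t}",
    "amount_sum":   "SELECT ROUND(SUM(amount), 2) FROM {t}",
    "distinct_ids": "SELECT COUNT(DISTINCT order_id) FROM {t}",
}

for name, sql in checks.items():
    src_val = cur.execute(sql.format(t="src_orders")).fetchone()[0]
    tgt_val = cur.execute(sql.format(t="tgt_orders")).fetchone()[0]
    status = "PASS" if src_val == tgt_val else "FAIL"
    print(f"{name}: source={src_val} target={tgt_val} -> {status}")

# Row-level diff: records present in the source but missing in the target.
missing = cur.execute("""
    SELECT order_id FROM src_orders
    EXCEPT
    SELECT order_id FROM tgt_orders
""").fetchall()
print("missing in target:", missing)
conn.close()
```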

Posted 1 week ago

Apply

18.0 - 20.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Frontera Health is revolutionizing pediatric healthcare by developing a cutting-edge, tech-enabled platform that delivers essential therapies to rural families. Our platform leverages AI/ML to create a robust video-based data model for early intervention and developmental disorders. By collaborating closely with parents, caregivers, and clinical partners, we're bridging the gap in access to care, improving health equity, and providing personalized treatment plans. Backed by leading investors like Lightspeed and Lux, Frontera Health is poised for rapid growth. Our ABA direct services are designed to meet the unique needs of children in underserved communities, providing them with the support and resources they require to reach their full potential. We are passionate about ensuring that every child, regardless of their location or socioeconomic status, has access to high-quality healthcare. By leveraging our technology platform and partnering with local providers, we are able to deliver effective ABA therapy to families who may otherwise have limited access to these essential services. Make a Difference in Lives: Join Our Thriving ABA Team as a Registered Behavior Technician! Are you passionate about helping children and families navigate the challenges of Autism Spectrum Disorder? Do you find joy in seeing positive change through evidence-based practices? Frontera Health New Mexico is seeking dedicated and compassionate Behavior Technicians (BT) or Registered Behavior Technicians (RBT) to join our dynamic team. Why Frontera Health New Mexico? Make a real impact: Your work will directly influence the lives of children and families, helping them unlock their potential and achieve their goals. Join a supportive community: We're a passionate team of professionals who collaborate and learn from each other, creating a positive and encouraging environment. Career growth opportunities: We invest in ongoing training and development, so you can expand your skills and advance your career in Applied Behavior Analysis (ABA). Competitive compensation and benefits: We offer competitive pay, comprehensive benefits, and a chance to contribute to a mission-driven organization. Recognition for experience: Join Frontera Health New Mexico as a certified RBT and receive a retention bonus in appreciation of the skills and dedication you bring to the team. What You'll Do: Partner with Board Certified Behavior Analysts (BCBAs) to provide therapy that transforms lives. Create engaging and effective learning experiences for children with Autism Spectrum Disorder (ASD). Use clear communication and positive reinforcement to help children reach their full potential. Complete daily progress notes related to the implementation of the intervention plan. Create an environment that fosters skill acquisition, functional communication, and school readiness for children. Collaborate with families to implement strategies at home and in the community. Who You Are: You're passionate about helping children and families. You're a natural communicator and enjoy building relationships. You're organized, detail-oriented, and committed to quality.
You're eager to learn and grow. Required Qualifications: Proof of high school graduation (Diploma or GED). College enrollment or degree preferred. At least 18 years of age. Health & Safety Requirements: Reliable transportation. Ability to lift 50 lbs, sit on floors and/or child-sized furniture, and quickly move from a seated position to a running stance. Training and Development: Personalized Development Opportunities: We believe in investing in our team's success. You'll have access to comprehensive training and resources to expand your skills and knowledge in ABA, tailored to your unique needs and goals. RBT Credential Support: Earning your RBT credential opens doors in the ABA field. We provide the resources and support you need to achieve this milestone, including 40-hour training, competency assessments, and exam support. Continuous Learning: Our supportive community fosters constant growth. Through regular team meetings, supervision with BCBAs, and ongoing learning opportunities, you'll stay at the forefront of ABA practices. Empowering Environment: We believe in providing a collaborative and encouraging atmosphere where you can ask questions, seek guidance, and feel supported in your professional journey. Ready to make a difference? We'd love to hear from you! Apply today and join our team of dedicated professionals who are changing lives, one child at a time. Frontera Health, Inc. is committed to creating and maintaining a diverse, equitable, and inclusive workplace where everyone feels valued, respected, and has the opportunity to thrive. We believe that our differences make us stronger and that all employees, regardless of their background, experiences, or abilities, contribute to our success. We are committed to: Providing equal employment opportunities to all qualified individuals, without regard to race, color, religion, sex, national origin, disability status, sexual orientation, gender identity or expression, age, genetic information, veteran status, or any other characteristic protected by law. Fostering a culture of inclusion and belonging where everyone feels valued and respected. Providing reasonable accommodations to employees with disabilities. Continuously learning and improving our DE&I practices. We will achieve this commitment by: Recruiting and hiring a diverse workforce that reflects the communities we serve. Creating and maintaining an inclusive work environment that is free from discrimination and harassment. Actively listening to and addressing the needs and concerns of all employees. We believe that diversity, equity, and inclusion are essential to our success as a company and to our mission of serving the pediatric behavioral health community. We are committed to continuous improvement in this area and welcome feedback from all employees.

Posted 1 week ago

Apply

12.0 - 17.0 years

13 - 18 Lacs

Mumbai

Work from Office

Overview: We are seeking an experienced Data Architect with over 12 years of expertise in data engineering, big data, and cloud data solutions, particularly on Microsoft Azure. The ideal candidate will have a proven track record of delivering scalable data architectures, building enterprise data lakes, leading complex migrations, and architecting real-time and batch data pipelines. You'll be responsible for end-to-end architecture, from data ingestion and transformation to governance, analytics, and performance optimization. Key Responsibilities: Architecture & Design: Design scalable, high-performance, cloud-native data architectures using Azure Data Lake, Azure Synapse, and Databricks. Develop high-level and low-level architecture documents (HLD/LLD) for modern data platforms. Define data models using star and snowflake schemas, optimizing for analytics and query performance. Data Engineering & ETL: Lead the development of ETL/ELT pipelines using Azure Data Factory, PySpark, Spark SQL, and Databricks. Manage ingestion of structured and semi-structured data from diverse sources into Azure-based data lakes and warehouses. Implement real-time data pipelines using Azure Event Hubs and Structured Streaming. Governance & Security: Define and implement data governance frameworks including lineage, cataloging, access controls, and compliance (e.g., GDPR). Collaborate with MDM and governance teams using tools like Informatica AXON and EDC. Performance Tuning & Optimization: Drive cost-efficient architecture design with partitioning, caching, indexing, and cluster optimization. Monitor and troubleshoot data pipelines using Azure Monitor, Log Analytics, and Databricks tools. Stakeholder Engagement: Collaborate with data scientists, analysts, business stakeholders, and DevOps teams to deliver robust, scalable data platforms. Conduct design reviews and training sessions to support platform adoption and knowledge sharing. Key Skills & Technologies: Cloud Platforms: Azure (ADF, ADLS, Azure SQL, Synapse, Databricks), AWS (S3, RDS, EC2). Big Data: Spark, Delta Lake, PySpark, Hadoop. ETL Tools: Azure Data Factory, Informatica, IBM DataStage. Data Modeling: Star, Snowflake, SCD, Fact & Dimension Tables. Programming: Python, PySpark, SQL, Shell Scripting, R. Visualization Tools: Power BI, Tableau, Cognos. Data Governance: Informatica MDM, AXON, EDC. Certifications Preferred: Microsoft Certified: Azure Data Engineer Associate; Databricks Data Engineer Associate / Professional.

Posted 1 week ago

Apply

9.0 - 14.0 years

3 - 7 Lacs

Hyderabad, Pune

Work from Office

":" Strategic Responsibilities: Architect enterprise-grade solutions using Palantir Foundry and AIP Lead AI application development, including agentic AI for business process automation Own end-to-end solution lifecycle: design \u2192 development \u2192 deployment \u2192 production support Define DevOps and platform engineering standards for Foundry deployments Guide data governance, security, and CI/CD automation across teams Collaborate with global teams to build scalable frameworks and reusable templates Lead environment governance, versioning strategy, and platform upgrade planning. Act as a technical advisor to stakeholders, translating complex requirements into actionable solutions.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 7 Lacs

Gurugram

Work from Office

Role Overview: We are seeking an AWS Data Engineer to join our team. The ideal candidate will have strong experience in AWS cloud services, data engineering, and ETL processes. This role involves designing, implementing, and maintaining data pipelines, ensuring data quality, and collaborating with cross-functional teams. Key Responsibilities: Design, develop, and maintain scalable data pipelines using AWS services. Implement ETL processes and ensure data quality and consistency. Optimize data infrastructure for performance and cost. Collaborate with data scientists and analysts to support their data needs. Implement and maintain data security measures. Document data architectures and processes. Requirements: Education: Bachelor's/Master's degree in Computer Science, Engineering, or a related field. Experience: Minimum 3 years of experience with AWS data services. Strong experience in data engineering and ETL processes. Experience with big data technologies. Technical Skills: Expertise in AWS services (Redshift, S3, Glue, EMR, Lambda). Strong programming skills in Python and SQL. Experience with data modeling and warehousing. Knowledge of data security best practices. Soft Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Self-motivated and able to work independently.
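As an illustration of the kind of pipeline this posting describes, the sketch below reads raw CSV from S3 with PySpark, applies a simple transformation, and writes partitioned Parquet back to S3, a pattern that runs on Glue or EMR. The bucket names, paths, and column names are hypothetical placeholders, not details from the role.

```python
# Minimal PySpark ETL sketch for an S3-based pipeline (runnable on Glue or EMR).
# Bucket names, paths, and columns below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-etl-sketch").getOrCreate()

# Extract: raw CSV landed in S3 by an upstream process.
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("s3://example-raw-bucket/orders/"))

# Transform: basic cleansing plus a derived partition column.
cleaned = (raw
           .dropDuplicates(["order_id"])
           .filter(F.col("amount").isNotNull())
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("order_year", F.year("order_date")))

# Load: columnar, partitioned output for downstream analytics.
(cleaned.write
 .mode("overwrite")
 .partitionBy("order_year")
 .parquet("s3://example-curated-bucket/orders/"))

spark.stop()
```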

Posted 1 week ago

Apply

3.0 - 8.0 years

6 - 9 Lacs

Gurugram

Work from Office

Role Overview: We are looking for an Azure Data Engineer to join our team. The ideal candidate will have strong experience in Microsoft Azure cloud services, data engineering, and ETL processes. This role involves designing, implementing, and maintaining data solutions on Azure, ensuring data quality, and collaborating with cross-functional teams. Key Responsibilities: Design and implement data solutions using Azure services. Develop and maintain ETL/ELT pipelines. Optimize data architectures for performance and cost. Ensure data security and compliance. Collaborate with data scientists and analysts. Document technical specifications and processes. Requirements: Education: Bachelor's/Master's degree in Computer Science, Engineering, or a related field. Experience: Minimum 3 years of experience with Azure data services. Strong background in data engineering. Experience with cloud data platforms. Technical Skills: Expertise in Azure services (Synapse Analytics, Data Factory, Databricks). Strong SQL and Python programming skills. Experience with data modeling and ETL. Knowledge of data security best practices. Soft Skills: Strong analytical and problem-solving skills. Excellent communication abilities. Team collaboration capabilities.

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Noida

Work from Office

Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate. Summary: In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. We are seeking a skilled and passionate React.js Developer with 5-8 years of experience to join our team. The ideal candidate should have a strong foundation in front-end development and a keen eye for delivering dynamic, high-performance web applications. You will collaborate closely with designers, backend developers, and product managers to build innovative solutions that enhance user experiences. Responsibilities: Develop and maintain responsive web applications using React.js. Collaborate with UI/UX designers to ensure technical feasibility of designs. Write clean, maintainable, and scalable code, following best practices. Integrate RESTful APIs and third-party services into applications. Troubleshoot and debug application issues in a timely manner. Mandatory skill sets: React.js, Angular, JavaScript, HTML/CSS. Preferred skill sets: -. Years of experience required: 5-8 years. Education qualification: BE/B.Tech/MBA/MCA. Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology, Bachelor of Engineering. Required Skills: Structured Query Language (SQL). Accepting Feedback, Active Listening, Algorithm Development, Alteryx (Automation Platform), Analytical Thinking, Analytic Research, Big Data, Business Data Analytics, Communication, Complex Data Analysis, Conducting Research, Creativity, Customer Analysis, Customer Needs Analysis, Dashboard Creation, Data Analysis, Data Analysis Software, Data Collection, Data-Driven Insights, Data Integration, Data Integrity, Data Mining, Data Modeling, Data Pipeline {+ 38 more}

Posted 1 week ago

Apply

12.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Manager. Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. Those in intelligent automation at PwC will focus on conducting process mining, designing next-generation small and large-scale automation solutions, and implementing intelligent process automation, robotic process automation and digital workflow solutions to help clients achieve operational efficiencies and reduce costs. A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities Overview: We are seeking a highly skilled and experienced Azure Data Architect to design, develop, and manage scalable and secure data solutions on Microsoft Azure. The ideal candidate will have a deep understanding of cloud data architecture, modern data platforms, data governance, and analytics frameworks. You will collaborate closely with data engineers, analysts, and business stakeholders to enable data-driven decision-making. Key Responsibilities: Design and implement end-to-end data architectures on Microsoft Azure (e.g., Data Lake, Data Factory, Synapse Analytics, Databricks). Define and implement data strategies, roadmaps, and governance frameworks. Lead the development of scalable data pipelines, integration workflows, and ETL/ELT processes. Design and optimize Azure SQL Databases, Cosmos DB, and other storage solutions. Ensure security, privacy, and compliance of data solutions (e.g., role-based access, data masking, encryption). Implement data modeling best practices for structured, semi-structured, and unstructured data. Collaborate with cross-functional teams to understand business requirements and translate them into data solutions. Monitor and troubleshoot data architecture performance and reliability issues. Guide and mentor data engineering teams on Azure best practices. Evaluate new Azure services and tools for continuous improvement. Strong knowledge of: Azure Data Factory, Azure Data Lake Storage (Gen2), Azure Synapse Analytics, Azure Databricks, Azure SQL Database / Cosmos DB, Azure Purview / Microsoft Fabric (optional). Mandatory skill sets: Proficiency in data modeling, warehousing, and modern data lakehouse concepts. Familiarity with Python, SQL, Spark, and Power BI. Deep understanding of data governance, data quality, and data security principles. Strong communication and stakeholder management skills.
Preferred skill sets: Azure Data Engineer Associate (DP-203), Azure Solutions Architect Expert (AZ-305), Azure Enterprise Data Analyst Associate (DP-500). Years of experience required: 12-15 years. Education qualification: B.Tech / M.Tech / MBA / MCA. Degrees/Field of Study required: Bachelor of Technology, MBA (Master of Business Administration). Required Skills: Data Architecture. Accepting Feedback, Active Listening, Agile Methodology, Analytical Thinking, Automation Algorithms, Automation Engineering, Automation Framework Design and Development, Automation Programming, Automation Solutions, Automation Studio, Automation System Efficiency, Blue Prism, Business Analysis, Business Performance Management, Business Process Analysis, Business Process Automation (BPA), Business Transformation, Business Value Optimization, C++ Programming Language, Coaching and Feedback, Cognitive Automation, Communication, Conducting Discovery, Configuration Management (CM) {+ 44 more}

Posted 1 week ago

Apply

8.0 - 13.0 years

13 - 18 Lacs

Hyderabad

Work from Office

Digital Innovation Process Analyst M/F - India, Hyderabad - 161265 | Safran. Job Description: We are seeking a highly experienced and data-driven Digital Innovation Process Analyst to lead end-to-end analytical initiatives that drive operational excellence, automation, and strategic decision-making across the organization. This role blends business process optimization, advanced analytics, and Power BI development, requiring a strong mix of technical expertise, business acumen, and stakeholder engagement. The ideal candidate will be responsible for analyzing complex processes, visualizing performance through dashboards, and identifying automation opportunities. Key Responsibilities: The Lean/Innovation specialist reports to the MRO Lean Manager and is responsible for the following: Support the Lean Manager in executing continuous improvement activities and advanced technology for MRO. Identify and eliminate waste and inefficiencies in processes and technologies through Lean/Innovation methods. Mentor cross-functional teams on Kaizen, digital tools, and various problem-solving methods. Work with Safran digital teams to implement new digital technology tools for the MRO. Prepare and deliver training on digital technology and methodologies to all new employees. Liaise with MRO ERP process owners and continually improve the quality of technology for the MRO. Lead root cause analysis using data from multiple systems. Design and maintain operational KPIs and performance dashboards to monitor business health and identify improvement areas. Develop business cases with ROI projections, cost-saving opportunities, and productivity benchmarks for improvement initiatives. Translate complex data into compelling stories and visualizations to drive stakeholder engagement and executive decision-making. Perform process modeling and process conformance analysis for business process management and optimization. Drive value enablement and roadmap execution for transformation programs across value streams. Collaborate on AI/ML-powered analytics projects to enhance business outcomes through predictive insights. Ensure seamless project execution with change management strategies to promote adoption and behavioral alignment. Power BI Responsibilities: Design and deliver interactive, performance-optimized Power BI dashboards and reports. Develop robust data models, calculated columns, and DAX measures aligned to business KPIs. Build and optimize ETL pipelines using Power Query (M) for structured and unstructured data sources. Manage role-based security (RLS) and user access within Power BI Service. Maintain and publish reports, configure scheduled refreshes, and manage on-premises data gateways. Create user-friendly documentation and training materials for end users. Stay current with Power BI features and advocate for continuous improvement. Job Requirements: 8+ years of experience in Business Intelligence, Data Analytics, and Reporting, with deep expertise in Power BI. Strong proficiency in DAX, Power Query (M), and data modeling best practices. Hands-on experience with cloud-based analytics platforms, preferably Microsoft Azure. Strong knowledge of SQL and relational databases (Oracle, SQL Server), as well as handling unstructured data. Familiarity with the Power Platform ecosystem (Power Apps, Power Automate) is a plus. Excellent analytical, logical reasoning, and problem-solving skills.
Strong communication and stakeholder management capabilities, especially in remote or distributed team settings. Ability to translate business challenges into actionable, data-driven insights. Company Information: Safran is an international high-technology group operating in the aviation (propulsion, equipment and interiors), defense, and space markets. Its core purpose is to contribute to a safer, more sustainable world, where air transport is more environmentally friendly, comfortable and accessible. Safran has a global presence, with 100,000 employees and sales of 27.3 billion euros in 2024, and holds, alone or in partnership, world or regional leadership positions in its core markets. Safran ranks second in the aerospace and defense industry in TIME magazine's "World's Best Companies 2024" ranking. Safran Aircraft Engines designs, produces and sells, alone or in partnership, commercial and military aircraft engines offering world-class performance, reliability and environmental compliance. Through CFM International*, Safran Aircraft Engines is the world's leading supplier of engines for single-aisle mainline commercial jets. * CFM International is a 50/50 joint venture between Safran Aircraft Engines and GE Aerospace.

Posted 1 week ago

Apply

4.0 - 8.0 years

10 - 11 Lacs

Hyderabad

Work from Office

Job Position Summary: The MetLife Corporate Technology (CT) organization is evolving to enable MetLife's New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife, including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit, and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate. We are seeking a highly motivated and skilled Azure Data Engineer to join our growing team in Hyderabad. This position is perfect for talented professionals with 4-8 years of experience in designing, building, and maintaining scalable cloud-based data solutions. As an Azure Data Engineer at MetLife, you will collaborate with cross-functional teams to enable data transformation, analytics, and decision-making by leveraging Microsoft Azure's advanced technologies. He/she should be a strategic thinker, an effective communicator, and an expert in technological development. Key Relationships: Internal stakeholders. Key Responsibilities: Design, develop, and maintain efficient and scalable data pipelines using Azure Data Factory (ADF) for ETL/ELT processes. Build and optimize data models and data flows in Azure Synapse Analytics, SQL Databases, and Azure Data Lake. Work with large datasets to define, test, and implement data storage, transformation, and processing strategies using Azure-based services. Create and manage data pipelines for ingesting, processing, and transforming data from various sources into a structured format. Develop solutions for real-time and batch processing using tools like Azure Stream Analytics and Event Hubs. Implement data security, governance, and compliance measures to ensure the integrity and accessibility of the organization's data assets. Contribute to the migration of on-premises databases and ETL processes to the Azure cloud. Build processes to identify, monitor, and resolve data inconsistencies and quality issues. Collaborate with data architects, business analysts, and developers to deliver reliable and performant data solutions aligned with business requirements. Monitor and optimize the performance and cost of Azure-based data solutions. Document architectures, data flows, pipelines, and implementations for future reference and knowledge sharing. Knowledge, Skills, and Abilities: Education: A Bachelor's/Master's degree in Computer Science or an equivalent Engineering degree. Candidate Qualifications: Education: Bachelor's degree in Computer Science, Information Systems, or a related field. Experience: Required: 4-8 years of experience in data engineering, with a strong focus on Azure-based services. Proficiency in Azure Data Factory (ADF), Azure Synapse Analytics, Azure Data Lake, and Azure SQL Databases. Strong knowledge of data modeling, ETL/ELT processes, and data pipeline design. Hands-on experience with Python, SQL, and Spark for data manipulation and transformation. Exposure to big data platforms like Hadoop, Databricks, or similar technologies.
Experience with real-time data streaming using tools like Azure Stream Analytics, Event Hubs, or Service Bus. Familiarity with data governance, best practices, and security protocols within cloud environments. Solid understanding of Azure DevOps for CI/CD pipelines around data workflows. Strong problem-solving skills with attention to detail and a results-driven mindset. Excellent collaboration, communication, and interpersonal skills for working with cross-functional teams. Preferred: Demonstrated experience in end-to-end cloud data warehouse migrations. Familiarity with Power BI or other visualization tools for creating dashboards and reports. Certification as an Azure Data Engineer Associate or Azure Solutions Architect is a plus. Understanding of machine learning concepts and integrating AI/ML pipelines is an advantage. Skills and Competencies: Language: Proficiency at business level in English. Competencies: Communication: Ability to influence and help communicate the organization's direction and ensure results are achieved. Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment. Diverse environment: Can-do attitude and ability to work in a fast-paced environment. Tech Stack: Development & Delivery Methods: Agile (Scaled Agile Framework). DevOps and CI/CD: Azure DevOps. Development Frameworks and Languages: SQL, Spark, Python. Azure: Functional knowledge of cloud-based solutions.

Posted 1 week ago

Apply

4.0 - 8.0 years

10 - 11 Lacs

Hyderabad

Work from Office

Job Position Summary: The MetLife Corporate Technology (CT) organization is evolving to enable MetLife's New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife, including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit, and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate. We are seeking a skilled Oracle HCM Technical Consultant to join our dynamic team in Hyderabad. This role is ideal for professionals with 4-8 years of experience who are passionate about leveraging Oracle HCM technology to solve business challenges. In this role, you will partner with cross-functional teams, play a pivotal role in system implementation and enhancement, and contribute to optimizing HR processes. He/she should be a strategic thinker, an effective communicator, and an expert in technological development. Key Relationships: Internal stakeholders. Key Responsibilities: Develop and customize Oracle HCM solutions to meet business requirements, including workflows, reports, integration interfaces, extensions, and configurations. Support the implementation, maintenance, and enhancement of HCM modules such as Core HR, Payroll, Absence Management, Talent Management, Benefits, or Learning Management. Collaborate with HR stakeholders and functional teams to understand requirements, gather specifications, and design tailored solutions. Troubleshoot and resolve technical issues in Oracle HCM applications, ensuring system continuity and optimal performance. Create technical documentation, including system design, configuration, and testing scenarios. Develop and manage data migration activities, ensuring accuracy and consistency of data during conversion to Oracle HCM. Build and maintain integrations with third-party applications using tools like Oracle Integration Cloud (OIC) or SOA. Assist in regular system upgrades, patches, and testing to ensure compliance with the latest Oracle releases. Provide technical expertise and support on Oracle HCM reporting tools, including OTBI, BI Publisher, and HCM Extracts. Work collaboratively with internal teams and vendor resources to meet project deadlines. Knowledge, Skills, and Abilities: Education: A Bachelor's/Master's degree in Computer Science or an equivalent Engineering degree. Candidate Qualifications: Education: Bachelor's degree in Computer Science, Information Systems, or a related field. Experience: Required: 4-8 years of experience as a technical consultant working with Oracle HCM Cloud applications. Strong expertise in Oracle HCM modules (Core HR, Payroll, Talent, Absence, etc.) and a solid understanding of business processes. Proficiency in Oracle SQL, PL/SQL, BI Publisher, HCM Extracts, Fast Formulas, and OTBI reports. Hands-on experience with Oracle integration tools (Oracle Integration Cloud or SOA Suite). Knowledge of data modeling, migration, and transformation techniques. Ability to develop and troubleshoot custom reports, interfaces, and extensions.
Strong problem-solving, analytical, and technical troubleshooting skills. Excellent communication and interpersonal skills for effective collaboration with stakeholders. Experience working in agile and project-based environments is preferred. Oracle HCM certifications are a plus but not mandatory. Preferred: Experience integrating Oracle HCM with on-premise or third-party systems. Familiarity with HR workflows, policies, and compliance requirements. Ability to manage systems testing, including unit, QA, end-to-end, and user acceptance testing. Experience managing vendors to SLAs. Proven experience collaborating with peers to establish best practices to achieve high service levels. Skills and Competencies: Language: Proficiency at business level in English. Competencies: Communication: Ability to influence and help communicate the organization's direction and ensure results are achieved. Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment. Diverse environment: Can-do attitude and ability to work in a fast-paced environment. Tech Stack: Development & Delivery Methods: Agile (Scaled Agile Framework). Development Frameworks and Languages: SQL, PL/SQL. Reporting Tools: OTBI, BI Publisher. Integrations: Oracle Integration Cloud (OIC) or SOA, HCM Extracts. Oracle HCM: Functional knowledge of Payroll, Benefits, Time and Labor, Absence, Learning, Performance and/or Compensation.

Posted 1 week ago

Apply

4.0 - 10.0 years

4 - 8 Lacs

Gurugram, Bengaluru

Work from Office

Analyze and transform data science prototypes. Develop ML applications as per requirements. Run machine learning tests and experiments. Perform analysis and fine-tuning of models using test results. Train and retrain ML models when necessary. Skills: Experience working on cloud platforms, preferably AWS. Proven experience as a Machine Learning Engineer. Experience with AWS SageMaker. Understanding of data structures, data modeling, and software architecture. Ability to write robust code in Python. Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn). Excellent communication skills. AWS, ML.
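The "run experiments, then fine-tune from the test results" loop mentioned above often starts with a scikit-learn baseline like the sketch below. The bundled dataset and the RandomForest model are illustrative assumptions; the posting does not prescribe either.

```python
# Minimal sketch of an ML experiment loop: train a baseline, evaluate on held-out
# data, then fine-tune hyperparameters from the results. Dataset and model choice
# are illustrative only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.metrics import classification_report

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

# Baseline experiment.
baseline = RandomForestClassifier(n_estimators=100, random_state=42)
baseline.fit(X_train, y_train)
print("baseline accuracy:", baseline.score(X_test, y_test))

# Fine-tuning: grid search over a small hyperparameter space with cross-validation.
grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 5, 10]},
    cv=5,
    n_jobs=-1,
)
grid.fit(X_train, y_train)
print("best params:", grid.best_params_)
print(classification_report(y_test, grid.best_estimator_.predict(X_test)))
```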

Posted 1 week ago

Apply

15.0 - 24.0 years

35 - 45 Lacs

Mumbai, Bengaluru, Mumbai (All Areas)

Work from Office

Greetings! This is regarding a job opportunity for a Data Architect with Datamatics Global Services Ltd. Position: Data Architect. Website: https://www.datamatics.com/ Job Location: Mumbai (Andheri - SEEPZ) / Bangalore (Kalyani Neptune, Bannerghatta Road). Job Description: Job Overview: We are seeking a Data Architect to lead end-to-end solutioning for enterprise data platforms while driving strategy, architecture, and innovation within our Data Center of Excellence (COE). This role requires deep expertise in Azure, Databricks, SQL, and Python, alongside strong pre-sales and advisory capabilities. The architect will serve as a trusted advisor, mentoring and guiding delivery teams, and defining scalable data strategies that align with business objectives. Key Responsibilities: Core Engineering, Data Architecture & Solutioning - Design and implement enterprise-wide data architectures, ensuring scalability, security, and performance. - Lead end-to-end data solutioning, covering ingestion, transformation, governance, analytics, and visualization. - Architect high-performance data pipelines leveraging Azure Data Factory, Databricks, SQL, and Python. - Establish data governance frameworks, integrating Delta Lake, Azure Purview, and metadata management best practices. - Optimize data models, indexing strategies, and high-volume query processing. - Oversee data security, access controls, and compliance policies within cloud environments. - Mentor engineering teams, guiding best practices in data architecture, pipeline development, and optimization. Data COE & Thought Leadership - Define data architecture strategies, frameworks, and reusable assets for the Data COE. - Drive best practices, standards, and innovation across data engineering and analytics teams. - Act as a subject matter expert, shaping data strategy, scalability models, and governance frameworks. - Lead data modernization efforts, advising on cloud migration, system optimization, and future-proofing architectures. - Deliver technical mentorship, ensuring teams adopt cutting-edge data engineering techniques. - Represent the Data COE in industry discussions, internal training, and thought leadership sessions. Pre-Sales & Solution Advisory - Engage in pre-sales consulting, defining enterprise data strategies for prospects and existing customers. - Craft solution designs and architecture blueprints, and contribute to proof-of-concept (PoC) implementations. - Partner with sales and consulting teams to translate client needs into scalable data solutions. - Provide strategic guidance on Azure, Databricks, and cloud adoption roadmaps. - Present technical proposals and recommendations to executive stakeholders and customers. - Stay ahead of emerging cloud data trends to enhance solution offerings. Required Skills & Qualifications: - 15+ years of experience in data architecture, engineering, and cloud data solutions. - Proven expertise in Azure, Databricks, SQL, and Python as primary technologies. - Proficiency in other relevant cloud and data engineering tools based on business needs. - Deep knowledge of data governance, metadata management, and security policies. - Strong pre-sales, consulting, and solution advisory experience in enterprise data platforms. - Advanced skills in SQL optimization, data pipeline architecture, and high-scale analytics. - Leadership experience in mentoring teams, defining best practices, and driving thought leadership. - Expertise in Delta Lake, Azure Purview, and scalable data architectures.
- Strong stakeholder management skills across technical and business domains. Preferred but Not Mandatory: - Familiarity with Microsoft Fabric and Power BI data accessibility techniques. - Hands-on experience with CI/CD for data pipelines, DevOps, and version control practices. Additional Notes: - The technologies listed above are primary but indicative. - The candidate should have the flexibility to work with additional tools and platforms based on business needs.

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Job title: Senior Software Engineer. Experience: 5-8 years. Primary skills: Python, Spark or PySpark, DWH ETL. Database: SparkSQL or PostgreSQL. Secondary skills: Databricks (Delta Lake, Delta tables, Unity Catalog). Work Model: Hybrid (twice weekly). Cab Facility: Yes. Work Timings: 10 AM to 7 PM. Interview Process: 3 rounds (3rd round face-to-face, mandatory). Work Location: Karle Town Tech Park, Nagawara, Hebbal, Bengaluru 560045. About the Business Unit: The Architecture Team plays a pivotal role in the end-to-end design, governance, and strategic direction of product development within Epsilon People Cloud (EPC). As a centre of technical excellence, the team ensures that every product feature is engineered to meet the highest standards of scalability, security, performance, and maintainability. Their responsibilities span architectural ownership of critical product features, driving techno-product leadership, enforcing architectural governance, and ensuring systems are built with scalability, security, and compliance in mind. They design multi-cloud and hybrid cloud solutions that support seamless integration across diverse environments and contribute significantly to interoperability between EPC products and the broader enterprise ecosystem. The team fosters innovation and technical leadership while actively collaborating with key partners to align technology decisions with business goals. Through this, the Architecture Team ensures the delivery of future-ready, enterprise-grade, efficient and performant, secure and resilient platforms that form the backbone of Epsilon People Cloud. Why we are looking for you: You have experience working as a Data Engineer with strong database fundamentals and an ETL background. You have experience working in a data warehouse environment and dealing with data volumes of terabytes and above. You have experience working with relational data systems, preferably PostgreSQL and SparkSQL. You have excellent design and coding skills and can mentor a junior engineer in the team. You have excellent written and verbal communication skills. You are experienced and comfortable working with global clients. You work well with teams and are able to work with multiple collaborators, including clients, vendors, and delivery teams. You are proficient with bug tracking and test management toolsets that support development processes such as CI/CD. What you will enjoy in this role: As part of the Epsilon Technology practice, the pace of the work matches the fast-evolving demands in the industry. You will get to work on the latest tools and technology and deal with data of petabyte scale. Work on homegrown frameworks on Spark, Airflow, etc. Exposure to the digital marketing domain, where Epsilon is a market leader. Understand and work closely with consumer data across different segments that will eventually provide insights into consumer behaviours and patterns to design digital ad strategies. As part of the dynamic team, you will have opportunities to innovate and put your recommendations forward, using existing standard methodologies and defining new ones as industry standards evolve. Opportunity to work with Business, System and Delivery teams to build a solid foundation in the digital marketing domain. An open and transparent environment that values innovation and efficiency. Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice. What will you do?
Develop a deep understanding of the business context in which your team operates and present feature recommendations in an agile working environment. Lead, design, and code solutions on and off the database to ensure application access and enable data-driven decision-making for the company's multi-faceted ad serving operations. Work closely with Engineering resources across the globe to ensure enterprise data warehouse solutions and assets are actionable, accessible, and evolving in lockstep with the needs of the ever-changing business model. This role requires deep expertise in Spark and strong proficiency in ETL, SQL, and modern data engineering practices. Design, develop, and manage ETL/ELT pipelines in Databricks using PySpark/SparkSQL, integrating various data sources to support business operations. Lead in the areas of solution design, code development, quality assurance, data modelling, and business intelligence. Mentor junior engineers in the team. Stay abreast of developments in the data world in terms of governance, quality, and performance optimization. Run effective client meetings, understand deliverables, and drive successful outcomes. Qualifications: Bachelor's degree in Computer Science or an equivalent degree is required. 5-8 years of data engineering experience with expertise in Apache Spark and databases (preferably Databricks) in marketing technologies and data management, and technical understanding in these areas. Monitor and tune Databricks workloads to ensure high performance and scalability, adapting to business needs as required. Solid experience in basic and advanced SQL writing and tuning. Experience with Python. Solid understanding of CI/CD practices, with experience in Git for version control and integration for Spark data projects. Good understanding of disaster recovery and business continuity solutions. Experience with scheduling applications with complex interdependencies, preferably Airflow. Good experience working with geographically and culturally diverse teams. Understanding of data management concepts in both traditional relational databases and big data lakehouse solutions such as Apache Hive, AWS Glue or Databricks. Excellent written and verbal communication skills. Ability to handle complex products. Good communication and problem-solving skills, with the ability to manage multiple priorities. Ability to diagnose and solve problems quickly. Diligent; able to multi-task, prioritize, and quickly change priorities. Good time management. Good to have: knowledge of cloud platforms (cloud security) and familiarity with Terraform or other infrastructure-as-code tools. About Epsilon: Epsilon is a global data, technology and services company that powers the marketing and advertising ecosystem. For decades, we have provided marketers from the world's leading brands the data, technology and services they need to engage consumers with 1 View, 1 Vision and 1 Voice. 1 View of their universe of potential buyers. 1 Vision for engaging each individual. And 1 Voice to harmonize engagement across paid, owned and earned channels. Epsilon's comprehensive portfolio of capabilities across our suite of digital media, messaging and loyalty solutions bridges the divide between marketing and advertising technology. We process 400+ billion consumer actions each day using advanced AI and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements.
Thanks to the work of every employee, Epsilon has been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Epsilon is a global company with more than 9,000 employees around the world.
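For illustration only (this is not Epsilon's code), the Databricks ETL/ELT pipeline work described in this posting typically looks like the following PySpark sketch; the landing path, column names, and target table are assumptions.

    from pyspark.sql import SparkSession, functions as F

    # On Databricks a SparkSession is already provided; this line keeps the sketch standalone.
    spark = SparkSession.builder.appName("campaign_events_etl").getOrCreate()

    # Read raw ad-serving events from a hypothetical landing zone.
    raw = spark.read.json("/mnt/raw/ad_events/")

    # Aggregate impressions per campaign per day.
    daily = (
        raw.filter(F.col("event_type") == "impression")
           .withColumn("event_date", F.to_date("event_ts"))
           .groupBy("event_date", "campaign_id")
           .agg(F.count("*").alias("impressions"))
    )

    # Persist as a Delta table; with Unity Catalog this would usually be catalog.schema.table.
    daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_impressions")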

Posted 1 week ago

Apply

1.0 - 7.0 years

12 - 16 Lacs

Hyderabad

Work from Office

Skillsoft is seeking an experienced Data Integration Engineer to support and modernize our data integration processes. This role is responsible for managing the traditional ETL lifecycle while driving the transition to event-driven, API-based solutions. The ideal candidate will support existing systems while driving operational excellence and modernization initiatives.

Opportunity Highlights:

ETL Development & Data Management:
- Design, develop, and optimize ETL processes to integrate data from multiple sources.
- Ensure data integrity, accuracy, and security across all integration workflows.
- Troubleshoot and resolve ETL job failures, optimizing performance and throughput.

Database Administration & Support:
- Support schema design, indexing strategies, and query optimization for efficient data retrieval.
- Provide database administration support for ETL workflows and integration projects.

Modernization & Innovation:
- Drive the transition from traditional ETL processes to modern, event-driven, API-based data integration solutions (see the sketch after this posting).
- Develop and implement strategies for data process modernization.
- Explore and implement AI/ML-driven automation for API-based integration workflows.
- Stay updated on the latest trends and technologies in data integration and apply them to improve existing systems.

Operational Excellence:
- Support and maintain existing data integration systems.
- Optimize data pipelines for performance and efficiency.
- Collaborate with cross-functional teams to understand data needs and deliver effective solutions.
- Define and monitor KPIs for data integration and database performance.

Skills & Qualifications:
- Proven experience in managing traditional ETL lifecycles.
- Strong knowledge of event-driven architectures and API-based data integration.
- Proficiency in SQL and experience with database management systems.
- Ability to create and modify C# scripts within SSIS for custom API integrations.
- Experience with cloud-based data integration tools and platforms.
- Experience working in Agile/Scrum environments.
- Effective communication and collaboration skills.
- Ability to manage multiple priorities and deliver in a fast-paced environment.
- A passion for innovation and continuous improvement.
- 5-10 years of experience in ETL development, data integration, and database administration.
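As a hedged sketch of the event-driven, API-based pattern this role moves toward (not Skillsoft's implementation; the endpoint, payload shape, and event source are assumptions), a small Python handler might look like this:

    import json
    import requests

    def handle_event(event: dict) -> None:
        """Push one change event to a downstream system through its REST API."""
        resp = requests.post(
            "https://api.example.com/v1/learner-records",   # hypothetical endpoint
            json={"id": event["id"], "status": event["status"]},
            timeout=10,
        )
        resp.raise_for_status()

    # In production the events would arrive from a broker (e.g. Kafka or SQS);
    # here a small in-memory batch stands in for the queue.
    events = [{"id": 101, "status": "enrolled"}, {"id": 102, "status": "completed"}]
    for evt in events:
        handle_event(evt)
        print("processed", json.dumps(evt))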

Posted 1 week ago

Apply

15.0 - 24.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

About Client: Hiring for one of the most prestigious multinational corporations!

Job Description
Job Title: Senior Manager - ServiceNow Platform Technical Architect
Qualification: BE / B.Tech
Relevant Experience: 15+ years

Must Have Skills:
The ServiceNow Technical Architect is responsible for designing and implementing scalable, secure, and efficient ServiceNow solutions that align with business objectives. This role involves assessing the current architecture, redefining the platform architecture in line with best practices, defining integration requirements with enterprise systems, and providing technical expertise to the development teams.

Key Responsibilities

Solution Architecture & Design:
- Define and implement scalable, performant ServiceNow solutions that align with business needs.
- Develop and maintain ServiceNow architecture standards, best practices, and governance frameworks.
- Ensure technical alignment with enterprise IT strategies, security policies, and compliance requirements.
- Customize and implement SRE practices for the platform and associated products/applications.

Technical Leadership & Governance:
- Provide architectural guidance and technical mentorship to ServiceNow developers and administrators.
- Conduct code reviews, technical audits, and system performance optimizations.
- Establish governance policies for instance management, upgrades, and platform security.

Platform Development & Customization:
- Oversee the customization and configuration of ServiceNow applications and modules.
- Review custom scripts, business rules, UI policies, workflows, and integrations from a ServiceNow best-practices standpoint.
- Utilize ServiceNow development tools such as Flow Designer, UI Builder, and Glide APIs.

Integration & Automation:
- Design and implement integrations with third-party systems using REST, SOAP, LDAP, and MID Server (see the sketch after this posting).
- Optimize workflows and automation using Integration Hub and Orchestration.
- Ensure seamless data flow between ServiceNow and enterprise applications.

Collaboration & Stakeholder Management:
- Work closely with business analysts, product owners, and IT teams to translate business requirements into technical solutions.
- Provide expert consultation on ServiceNow capabilities, roadmap, and implementation strategies.
- Communicate technical concepts effectively to both technical and non-technical stakeholders.

Platform Maintenance & Upgrades:
- Manage ServiceNow upgrades, patching, and instance performance monitoring.
- Ensure minimal disruption and adherence to ServiceNow's upgrade best practices.
- Stay up to date with new ServiceNow releases and features.

Required Qualifications:
- 18-20 years of total software engineering/IT experience, including 15+ years of ServiceNow development/operations and technical architecture experience.
- ServiceNow expertise: deep understanding of ServiceNow architecture, data model, and development best practices.
- Development skills: proficiency in JavaScript, Glide APIs, Flow Designer, and UI development.
- Integration knowledge: experience with REST, SOAP, JSON, LDAP, OAuth, and integration middleware.
- Cloud & security: understanding of cloud infrastructure, IT security, and compliance frameworks.
- Certifications: ServiceNow Certified Technical Architect (must have); ServiceNow Certified System Administrator (CSA), Certified Implementation Specialist (CIS), Certified Application Developer (CAD).

Preferred Qualifications:
- Experience working in Agile/Scrum environments.
- Familiarity with the ITIL framework and best practices.
- Strong problem-solving, analytical, and communication skills.
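As a hedged example of the REST integration work listed above (not the client's code; the instance URL, credentials, and query are assumptions), the ServiceNow Table API can be called from Python like this:

    import requests

    instance = "https://dev12345.service-now.com"   # hypothetical instance
    resp = requests.get(
        f"{instance}/api/now/table/incident",
        params={"sysparm_query": "active=true^priority=1", "sysparm_limit": 5},
        auth=("integration.user", "secret"),          # basic auth shown; OAuth is also common
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    for incident in resp.json()["result"]:
        print(incident["number"], incident["short_description"])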
Location: Hyderabad
CTC Range: As per market standards
Notice Period: Immediate
Shift Timing: General shift
Mode of Interview: Virtual
Mode of Work: Work from office

Bhuvaneshwari S
Senior Specialist
Black and White Outsourcing Pvt Ltd
Bangalore, Karnataka, INDIA
bhuvaneshwari@blackwhite.in | www.blackwhite.in

Posted 1 week ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Chandigarh

Work from Office

Design, develop & optimize complex databases. Create, deploy & maintain interactive Power BI reports & dashboards to meet business intelligence requirements. Develop SQL queries, stored procedures & functions to extract, manipulate & analyse data.

Required Candidate Profile:
Power BI Developer with 5+ years' experience in building dynamic dashboards, interactive reports & data models. Microsoft Certified Power BI Data Analyst Associate. Strong knowledge of SQL, T-SQL & DMS.
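As a small, hedged illustration of the kind of SQL that typically feeds such a Power BI dataset (the server, database, and table names are assumptions, not from this posting):

    import pyodbc

    # Hypothetical SQL Server connection; swap in real server/database credentials.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=reporting-db.example.com;DATABASE=SalesDW;"
        "UID=report_user;PWD=secret"
    )
    cursor = conn.cursor()
    # T-SQL aggregation of the last 12 months of sales by region.
    cursor.execute("""
        SELECT region, SUM(net_amount) AS total_sales
        FROM dbo.FactSales
        WHERE order_date >= DATEADD(month, -12, GETDATE())
        GROUP BY region
        ORDER BY total_sales DESC;
    """)
    for region, total_sales in cursor.fetchall():
        print(region, total_sales)
    conn.close()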

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You should have a total of 8+ years of development/design experience, with a minimum of 3 years' experience in Big Data technologies on-premises and in the cloud. Proficiency in Snowflake and strong SQL programming skills are required.

In addition, you should have:
- Strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools such as Snowflake, BigQuery, or Redshift.
- Experience with BI tools such as Tableau, QuickSight, or Power BI (at least one is required).
- Strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring (see the sketch after this posting).
- A good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalog.
- A strong understanding of cloud services (AWS or Azure), including IAM and log analytics.
- Excellent interpersonal and teamwork skills, with experience leading and mentoring other team members.
- Good knowledge of Agile Scrum and good communication skills.

Your day-to-day responsibilities will correspond to the skills and experience described above.

At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development: you will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic.

GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will have the chance to work on projects that matter and engage your curiosity and creative problem-solving skills. We believe in the importance of balance and flexibility: with many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. GlobalLogic is a high-trust organization where integrity is key. By joining us, you are placing your trust in a safe, reliable, and ethical global company.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
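As a hedged sketch of the Snowflake ETL/ELT work described above (the account, stage, and table names are assumptions), loading a staged file and validating the row count might look like this in Python:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="secret",      # hypothetical credentials
        warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
    )
    cur = conn.cursor()
    # Load a staged CSV file into a staging table, then validate the row count.
    cur.execute(
        "COPY INTO STAGING.ORDERS FROM @ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute("SELECT COUNT(*) FROM STAGING.ORDERS")
    print("rows loaded:", cur.fetchone()[0])
    cur.close()
    conn.close()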

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a skilled Salesforce Developer with 4-6 years of experience, you will be responsible for designing, developing, and implementing custom Salesforce applications. Your expertise in Apex, Visualforce, Lightning Components, and integrations will be crucial in enhancing our Salesforce environment. Collaborating with cross-functional teams, you will play a key role in ensuring the success of our Salesforce projects.

Your key responsibilities will include:
- Designing, developing, testing, and deploying custom Salesforce applications using Apex, Visualforce, Lightning Components, LWC, and other Salesforce technologies.
- Configuring Salesforce to align with business requirements, including creating and modifying objects, Lightning flows, validation rules, and approval processes.
- Integrating Salesforce with other systems and third-party applications through REST/SOAP APIs, middleware, and ETL tools (a small REST example follows this posting).
- Managing data integrity by implementing data migration and data cleansing strategies, as well as overseeing data modeling, security, and backups.
- Creating and maintaining technical documentation, such as design specifications, deployment plans, and system architecture diagrams.
- Providing ongoing support and troubleshooting for Salesforce-related issues, including debugging and performance optimization.
- Ensuring adherence to Salesforce development best practices by conducting code reviews, testing, and maintaining code quality standards.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4-6 years of hands-on experience in Salesforce development.
- Salesforce Certified Platform Developer I or II.

Technical Skills:
- Proficiency in Apex, Visualforce, Lightning Components, LWC, and SOQL.
- Experience with Salesforce integrations using REST/SOAP APIs and familiarity with middleware tools.
- Knowledge of Salesforce declarative tools such as Lightning Flow and Workflow Rules.
- Understanding of the Salesforce security model, including profiles, roles, and sharing rules.
- Familiarity with CI/CD tools for Salesforce, such as Jenkins or Salesforce DX.
- Experience with Agile/Scrum development methodologies.

If you are a proactive, detail-oriented Salesforce Developer with a passion for innovation and a commitment to excellence, we invite you to join our dynamic team and contribute to the success of our Salesforce projects.
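As a hedged illustration of a simple Salesforce REST integration (not this employer's code; the credentials, object, and fields are assumptions), the simple_salesforce Python client can run a SOQL query like this:

    from simple_salesforce import Salesforce

    # Hypothetical credentials; real integrations would typically use OAuth or a connected app.
    sf = Salesforce(
        username="integration.user@example.com",
        password="secret",
        security_token="token",
    )
    # SOQL query against the Account object, the kind of data a middleware sync would pull.
    results = sf.query(
        "SELECT Id, Name, AnnualRevenue FROM Account WHERE AnnualRevenue > 1000000"
    )
    for record in results["records"]:
        print(record["Id"], record["Name"], record["AnnualRevenue"])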

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Kolkata, West Bengal

On-site

At EY, the focus is on shaping your future with confidence. By joining EY, you become part of a globally connected powerhouse of diverse teams that can propel your career in any direction you desire. EY's mission is to contribute to building a better working world.

To excel in this role, you should possess the following qualifications:
- Demonstrated expertise of 5 to 7 years in Power BI, with a deep understanding of DAX and the Power Query formula language (M language).
- Advanced knowledge of data modeling, data warehousing, and ETL techniques.
- Proficiency in designing, developing, and maintaining Power BI reports and dashboards, including paginated reports, to facilitate business decision-making processes.
- Experience in creating and implementing Power BI data models for intricate and large-scale enterprise environments.
- A proven track record of deploying and optimizing large datasets effectively.
- Proficiency in SQL and other data querying languages.
- Strong collaboration, analytical, interpersonal, and communication skills.

Ideally, candidates for this role should also have:
- A Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Microsoft Power BI certification.
- Experience with other Business Intelligence (BI) tools.
- Familiarity with enterprise data products such as Databricks and MS Fabric would be advantageous.
- Previous successful collaboration within large teams to implement Power BI solutions.
- Sound knowledge of the software development lifecycle and experience with Git.
- Ability to propose solutions based on best practices derived from Microsoft documentation, whitepapers, and community publications.

EY is dedicated to building a better working world by generating new value for clients, people, society, and the planet, while instilling trust in capital markets. Using data, artificial intelligence (AI), and advanced technology, EY teams assist clients in shaping the future with confidence and crafting solutions for the most critical issues of today and tomorrow. EY operates across a wide range of services in assurance, consulting, tax, strategy, and transactions. Leveraging sector insights, a globally connected multi-disciplinary network, and diverse ecosystem partners, EY offers services in over 150 countries and territories.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Specialist in Data Science at our company, you will have the opportunity to leverage analytics and technology to drive decision-making and address some of the world's greatest health threats. You will be part of the Insights, Analytics, and Data organization, working with partners in various therapeutic and domain areas to create scalable, production-grade analytics solutions. Your role will involve collaborating with market leaders to tackle critical business questions using data science solutions and translating business queries into analytical problems.

We are looking for candidates with prior experience in the healthcare analytics or consulting sectors, leading data science teams, and delivering end-to-end data science projects. You should have a thorough understanding of physician- and patient-level data from leading vendors and extensive experience in commercial pharma analytics. Effective communication skills are crucial, as you will be interfacing with executive and business stakeholders.

As a Senior Specialist in Data Science, you should have a solid foundation in statistics and machine learning and be able to work in high-performance computing environments. You should be self-motivated, with the ability to think independently and structure your data science approach according to the task at hand. Collaboration, continuous learning, and effective communication are key aspects of this role.

Key Responsibilities:
- Lead a moderate-sized team of Data Scientists to solve complex business problems.
- Collaborate with business leaders to define and prioritize business problems and conceptualize data science solutions.
- Standardize and scale data science solutions to increase delivery efficiency.
- Collaborate with cross-functional teams to design and implement solutions meeting business requirements.
- Present findings to senior business stakeholders and ensure the technical and professional development of junior team members.
- Develop expertise in the therapeutic area of interest and contribute to thought leadership through publications and presentations.

Minimum Qualifications:
- Bachelor's degree with 8-10 years of industry experience.
- Extensive experience in the healthcare analytics or consulting sectors.
- Strong Python/R, SQL, and Excel skills.
- Strong foundation in statistics and machine learning (a brief modeling sketch follows this posting).

Preferred Qualifications:
- Advanced degree in STEM (MS, MBA, PhD).
- Experience in Oncology/Vaccine/Pharma & Rare Diseases therapeutic area commercial analytics.
- Experience in end-to-end program management.

Join us in our mission to put patients first and bring breakthrough medicines to customers worldwide. We are committed to fostering an inclusive and diverse workplace where diverse ideas come together for innovative solutions. We are an equal opportunity employer and encourage respectful challenge and collective problem-solving. Apply now if you meet the qualifications and are passionate about making a difference in the world of healthcare analytics and data science.
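As a minimal, hedged sketch of the statistics and machine-learning foundation the role asks for (the dataset, columns, and target are illustrative, not from this posting), a baseline classifier in Python might look like this:

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Hypothetical patient-level feature extract with a binary conversion flag.
    df = pd.read_csv("patient_features.csv")
    X = df.drop(columns=["converted"])
    y = df["converted"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42, stratify=y
    )
    model = RandomForestClassifier(n_estimators=300, random_state=42)
    model.fit(X_train, y_train)
    print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))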

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies