5.0 - 9.0 years
15 - 25 Lacs
Bengaluru
Hybrid
Position: Data Engineer. Skills required: experience in Python/PySpark and strong SQL Server. Good to have: Azure Databricks, Azure Data Factory (ADF), Azure Synapse, or Snowflake.
Posted 1 week ago
6.0 - 11.0 years
20 - 27 Lacs
Pune
Hybrid
6+ years of experience as a Data Engineer. Expertise in the Azure platform: Azure SQL DB, ADF, and Azure Synapse. 5+ years of experience in database development using SQL, plus knowledge of data modeling, ETL processes, and data warehouse design principles.
Posted 1 week ago
5.0 years
8 - 12 Lacs
Hyderabad
Work from Office
When our values align, there's no limit to what we can achieve. At Parexel, we all share the same goal - to improve the world's health. From clinical trials to regulatory, consulting, and market access, every clinical development solution we provide is underpinned by something special - a deep conviction in what we do. Each of us, no matter what we do at Parexel, contributes to the development of a therapy that ultimately will benefit a patient. We take our work personally, we do it with empathy, and we're committed to making a difference.

Key Accountabilities:
• Using Microsoft Azure data PaaS services, design, build, modify, and support data pipelines leveraging Databricks and Power BI in a medallion architecture setting (see the sketch below).
• If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders.
• Demonstrate an excellent grasp of, and expertise with, test-driven development and continuous integration processes.
• Analysis and Design: convert high-level designs to low-level designs and implement them.
• Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans.
• Create and run unit and integration tests on all created code throughout the development lifecycle.
• Benchmark application code proactively to prevent performance and scalability concerns.
• Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.
• Support and Troubleshooting: assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments; assist other teams in resolving issues that may develop as a result of applications or the integration of multiple components.

Knowledge and Experience:
• Understanding of design concepts and architectural basics; knowledge of performance engineering.
• Understanding of quality processes and estimation methods, and a fundamental grasp of the project domain.
• Ability to translate functional and non-functional requirements into system requirements.
• Ability to develop and code complex applications, and to create test cases and scenarios based on specifications.
• Solid knowledge of SDLC and agile techniques; awareness of current technology and trends.
• Logical thinking and problem-solving abilities, as well as the capacity to collaborate.

Primary skills: Cloud Platform, Azure, Databricks, ADF, ADO. Sought: SQL, Python, Power BI. General knowledge: PowerApps, Java.

3-5 years of experience in software development, with a minimum of 2 years in cloud computing. Education: Bachelor of Science in Computer Science, Engineering, or a related technical field.
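For context on the medallion pattern this role references, here is a minimal, hypothetical PySpark sketch of a bronze-to-silver hop on Databricks; the table names, columns, and cleaning rules are illustrative assumptions, not Parexel's actual pipeline.

```python
# Sketch only: a bronze-to-silver step in a medallion architecture on Databricks.
# Table and column names below are assumed for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

# Bronze: raw ingested records, stored as-is.
bronze = spark.read.table("bronze.clinical_events")

# Silver: cleaned and conformed - deduplicate, cast types, drop bad records.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("patient_id").isNotNull())
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.clinical_events")
```

A gold layer would typically follow, aggregating the silver table into reporting marts consumed by Power BI.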
Posted 1 week ago
6.0 - 11.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Job Summary: We are seeking a detail-oriented and results-driven Project Manager to lead projects within our Data Platform organization, with a strong focus on Databricks implementations. The ideal candidate will have experience managing end-to-end delivery of cloud-based data platforms and collaborating cross-functionally across data engineering, analytics, DevOps, and business stakeholders.

Key Responsibilities:
• Lead and manage full-lifecycle projects related to data platform initiatives, especially Databricks-based solutions across AWS or Azure.
• Develop and maintain project plans, schedules, budgets, and resource forecasts using tools like Jira, MS Project, or similar.
• Coordinate across technical teams (data engineering, ML, DevOps) and business units to define scope, deliverables, and success metrics.
• Facilitate sprint planning, daily stand-ups, retrospectives, and status reporting following Agile/Scrum or hybrid methodologies.
• Identify risks, dependencies, and blockers early; drive resolution through mitigation plans and stakeholder communication.
• Manage vendor relationships (where applicable), ensuring delivery quality, alignment with architecture standards, and on-time execution.
• Ensure compliance with data governance, security, and documentation standards.
• Communicate regularly with senior leadership on project status, KPIs, and key decisions.

Required Qualifications:
• 5+ years of experience managing technical or data-related projects, with at least 2 years in cloud data platforms.
• Proven experience leading projects involving Databricks, Delta Lake, DBT, and distributed data pipelines.
• Strong knowledge of data lakehouse architecture, data ingestion/ETL, and modern data platforms (AWS, Azure).
• Solid understanding of Agile delivery practices, change management, and cross-functional coordination.
• Proficiency in project tracking tools (Jira, Confluence, Smartsheet, or Microsoft Project).
• Exceptional written and verbal communication skills; able to translate technical concepts to business audiences.

Preferred Qualifications:
• PMP, PMI-ACP, or Certified Scrum Master (CSM) certification.
• Prior experience in enterprise data modernization or AI/ML-enabling platforms.
• Prior experience on multi-cloud platforms.
• Familiarity with tools such as Airflow, Unity Catalog, Power BI/Tableau, and Git-based CI/CD processes.

Soft Skills:
• Strong leadership and stakeholder management.
• Proactive problem solver with a bias for execution.
• Excellent time management and multitasking ability.
• Comfortable working in a fast-paced, evolving environment.
Posted 1 week ago
3.0 - 7.0 years
5 - 8 Lacs
Mumbai
Work from Office
Position Summary: At NCR Atleos, our Internal Audit Department (IAD) purpose is to help enable competent and informed decisions to add value and improve operations, while contributing meaningfully to Board and organizational confidence. We are indispensable business partners, with a brand focused on insight, impact and excellence. We believe that everything we do is to enhance value, provide insights, and instill confidence. To do this, we must be relevant, connected, flexible, and courageous. NCR Atleos IAD is seeking a Data Analytics Manager who will play a critical role in enhancing the Internal Audit function through data-driven insights, analytics, and process optimization. This role will report directly to the Executive Director, Internal Audit.

Key Areas of Responsibility:
• Data Analytics Strategy & Execution: develop and implement data analytics methodologies to support the internal audit function; design and execute advanced data analysis scripts and models to identify trends, anomalies, and potential risks; partner with the audit teams to integrate analytics into audit planning, execution, and reporting.
• Audit Support: collaborate with the Director of Internal Audit to support audits in the areas of technology, information security, business processes, and financial operations; extract, analyze, and interpret data from various enterprise systems to support audit objectives; provide insights that enhance audit outcomes and help identify areas for operational improvement.
• Data Visualization & Reporting: create clear, actionable, and visually compelling reports and dashboards to communicate audit findings to stakeholders and the Audit Committee; develop templates and standards for data analytics in audit work products to ensure consistency and clarity.
• Collaboration & Training: work closely with IT, Finance, Operations, and other business units to gather data and validate insights; mentor and train other Internal Audit team members on leveraging data analytics tools and techniques; build partnerships across the organization to foster a culture of data-driven decision-making.
• Technology & Tools: identify, evaluate, and implement data analytics tools and technologies to improve audit processes; stay updated on emerging technologies and trends in data analytics and audit methodologies; support automation initiatives to enhance efficiency within the Internal Audit department.
• Compliance & Risk Management: ensure data analytics initiatives align with organizational compliance requirements and internal audit standards; monitor and evaluate data integrity, system reliability, and process controls across business units.
• Continuous Improvement: stay abreast of emerging technologies, audit methodologies, and regulatory changes; support the Executive Director in overseeing the use of technology within the audit function, including data analytics and audit management software, to enhance audit quality and efficiency; contribute to innovation and improvements to the IT audit process, controls, and the overall Internal Audit Department.

Qualifications:
• Education: Bachelor's or Master's in Computer Science, IT, Engineering, Data Science, Econometrics, or related fields.
• Experience: proven data analytics experience in internal audit or risk management, with strong analytical, problem-solving, and project management skills.
• Statistical Methods: proficient in regression, time series, clustering, and decision trees.
• Programming: skilled in JavaScript, Python, R, PHP, .NET, and SQL.
• Databases: expertise in relational databases, data warehouses, ETL, UI tools, and query optimization.
• Visualization: proficient in Tableau and Power BI, with advanced MS Office skills.
• Cloud Platforms: experience with Microsoft Azure, Databricks, Hadoop, or similar platforms.
• Project Management: experience managing analytics projects and stakeholders.
• Communication: ability to convey complex data insights to non-technical stakeholders.
• Leadership: demonstrated leadership and team mentoring skills.
• Cultural Sensitivity: ability to work effectively in a global environment.
• Languages: proficiency in multiple languages is an advantage.
• Ethics: high ethical standards and commitment to audit integrity.
• Confidentiality: ensuring the security of sensitive data.
• Team Environment: positive attitude within a dynamic team setting.
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Remote
Hiring for a US-based Multinational Company (MNC).

We are seeking a skilled and detail-oriented Data Engineer to join our team. In this role, you will design, build, and maintain scalable data pipelines and infrastructure to support business intelligence, analytics, and machine learning initiatives. You will work closely with data scientists, analysts, and software engineers to ensure that high-quality data is readily available and usable.

Responsibilities:
• Design and implement scalable, reliable, and efficient data pipelines for processing and transforming large volumes of structured and unstructured data.
• Build and maintain data architectures, including databases, data warehouses, and data lakes.
• Collaborate with data analysts and scientists to support their data needs and ensure data integrity and consistency.
• Optimize data systems for performance, cost, and scalability.
• Implement data quality checks, validation, and monitoring processes (see the sketch below).
• Develop ETL/ELT workflows using modern tools and platforms.
• Ensure data security and compliance with relevant data protection regulations.
• Monitor and troubleshoot production data systems and pipelines.

Requirements:
• Proven experience as a Data Engineer or in a similar role.
• Strong proficiency in SQL and at least one programming language such as Python, Scala, or Java.
• Experience with data pipeline tools such as Apache Airflow, Luigi, or similar.
• Familiarity with modern data platforms and tools: Big Data (Hadoop, Spark); Data Warehousing (Snowflake, Redshift, BigQuery, Azure Synapse); Databases (PostgreSQL, MySQL, MongoDB).
• Experience with cloud platforms (AWS, Azure, or GCP).
• Knowledge of data modeling, schema design, and ETL best practices.
• Strong analytical and problem-solving skills.
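As an illustration of the data quality checks and validation this listing mentions, a small PySpark sketch; the source path, key column, and failure threshold are hypothetical.

```python
# Sketch of a fail-fast data-quality gate in a pipeline. Paths, column names,
# and the 1% duplicate threshold are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("s3://example-bucket/orders/")  # hypothetical source

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
duplicates = total - df.dropDuplicates(["order_id"]).count()

# Fail the run early rather than propagating bad data downstream.
if null_keys > 0 or duplicates / max(total, 1) > 0.01:
    raise ValueError(
        f"Data quality check failed: {null_keys} null keys, {duplicates} duplicates"
    )
```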
Posted 1 week ago
7.0 - 12.0 years
0 - 0 Lacs
Kochi
Work from Office
Greetings from the TCS Recruitment Team!

Role: Databricks Lead / Databricks Solution Architect / Databricks ML Engineer
Years of experience: 7 to 18 years
Walk-in drive location: Kochi
Venue: Tata Consultancy Services, TCS Centre SEZ Unit, Infopark Kochi Phase 1, Infopark Kochi P.O, Kakkanad, Kochi - 682042, Kerala, India
Drive time: 9:00 AM to 1:00 PM
Date: 21-Jun-25

Requirements:
• Must have 5+ years of experience in data engineering or related fields.
• At least 2-3 years of hands-on experience with Databricks (using Apache Spark, Delta Lake, etc.).
• Solid experience working with big data technologies such as Hadoop, Spark, Kafka, or similar.
• Experience with cloud platforms (AWS, Azure, or GCP) and cloud-native data tools.
• Experience with machine learning frameworks and pipelines, particularly in Databricks.
• Experience with AI/ML model deployment, MLOps, and ML lifecycle management using Databricks and related tools.
Posted 1 week ago
6.0 - 11.0 years
17 - 30 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Inviting applications for the role of Lead Consultant - Data Engineer, Azure+Python!

Responsibilities:
• Hands-on experience with Azure, PySpark, and Python with Kafka.
• Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
• Implement and maintain security measures to protect data and systems within the Azure environment, including IAM policies, security groups, and encryption mechanisms.
• Develop application programs using big data technologies like Apache Hadoop and Apache Spark.
• Build data pipelines by implementing ETL (Extract-Transform-Load) processes (a Kafka ingestion sketch follows below).
• Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
• Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
• Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
• Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems.
• Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security.
• Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way.
• Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum qualifications:
• Experience designing and implementing data pipelines, building data applications, and performing data migration on Azure.
• Experience with Databricks is an added advantage.
• Strong experience in Python and SQL.
• Strong understanding of security principles and best practices for cloud-based environments.
• Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
• Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
• Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred qualifications/skills:
• Master's degree in Computer Science, Electronics, or Electrical Engineering.
• Azure Data Engineering and cloud certifications; Databricks certifications.
• Experience working with Oracle ERP.
• Experience with multiple data integration technologies and cloud platforms.
• Knowledge of Change & Incident Management processes.
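To give a flavor of the Kafka-plus-PySpark work described above, a minimal Structured Streaming ingestion sketch; the broker address, topic, and sink paths are assumptions, and the Spark-Kafka connector package must be available on the cluster.

```python
# Sketch: ingest a Kafka topic with PySpark Structured Streaming and land it in Delta.
# Requires the spark-sql-kafka connector on the classpath; names below are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast before transforming.
parsed = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .start("/tmp/delta/events")
)
```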
Posted 1 week ago
5.0 - 10.0 years
0 - 1 Lacs
Kolkata, Hyderabad, Pune
Work from Office
• Experience with MongoDB is essential; knowledge of other databases such as SQL Server and PostgreSQL is also required. Must be able to understand and modify database objects based on business requests, and must have a deep understanding of preparing complex queries (see the sketch below).
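For a sense of the "complex queries" this posting asks about on the MongoDB side, a small aggregation-pipeline sketch via pymongo; the connection string, database, collection, and fields are hypothetical.

```python
# Sketch: top customers by revenue using a MongoDB aggregation pipeline.
# Connection details and document fields are illustrative assumptions.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

pipeline = [
    {"$match": {"status": "delivered"}},                                  # filter
    {"$group": {"_id": "$customer_id", "revenue": {"$sum": "$amount"}}},  # aggregate
    {"$sort": {"revenue": -1}},                                           # rank
    {"$limit": 10},
]
for row in orders.aggregate(pipeline):
    print(row["_id"], row["revenue"])
```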
Posted 2 weeks ago
6.0 - 10.0 years
14 - 24 Lacs
Hyderabad
Work from Office
Role & responsibilities
Job title: Data Engineer
Years of experience: 6 to 10 years (minimum 5 years of relevant experience)
Work mode: Work from Office (Hyderabad)
Notice period: immediate to 30 days only
Key skills: Python, SQL, AWS, Spark, Databricks (mandatory); Airflow (good to have)
Posted 2 weeks ago
5.0 - 9.0 years
19 - 23 Lacs
Mumbai
Work from Office
Overview: MSCI has an immediate opening in one of our fastest-growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be among the most mission-critical systems at MSCI. They require strong technology expertise and a strong sense of enterprise system design, state-of-the-art scalability and reliability, but also innovation. Your ability to make technology decisions within a consistent framework to support the growth of our company and products, to lead software implementations in close partnership with global leaders and multiple product organizations, and to drive technology innovation will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical, and people leadership in the design, development, quality assurance, and maintenance of our production systems, making sure we continue to scale our great franchise.

Responsibilities:
• Engage technical teams and business stakeholders to discuss and propose technical approaches to meet current and future needs.
• Define the technical target state of the product and drive achievement of the strategy.
• As the Lead Architect, lead the design, development, and maintenance of our data architecture, ensuring scalability, efficiency, and reliability.
• Create and maintain comprehensive documentation for the architecture, processes, and best practices, including Architecture Decision Records (ADRs).
• Evaluate recommendations and provide feedback on new technologies.
• Develop secure and high-quality production code, and review and debug code written by others.
• Identify opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems.
• Collaborate with a cross-functional team to draft, implement, and adapt the overall architecture of our products and support infrastructure, in conjunction with software development managers and product management teams.
• Stay abreast of new technologies and issues in the software-as-a-service industry, including current technologies, platforms, standards, and methodologies.
• Be actively engaged in setting technology standards that impact the company and its offerings.
• Ensure the knowledge sharing of engineering best practices across departments; develop and monitor technical standards to ensure adherence to them.

Qualifications:
• Prior senior software architecture roles.
• Demonstrated proficiency in programming languages such as Python/Java/Scala, and knowledge of SQL and NoSQL databases.
• Ability to drive the development of conceptual, logical, and physical data models aligned with business requirements.
• Ability to lead the implementation and optimization of data technologies, including Apache Spark.
• Experience with a table format such as Delta or Iceberg.
• Strong hands-on experience in data architecture, database design, and data modeling.
• Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio.
• Experience with cloud platforms such as AWS, Azure, or Google Cloud.
• Ability to dive into details; a hands-on technologist with strong core computer science fundamentals.
• Strong preference for financial services experience.
• Proven leadership of large-scale distributed software teams that have delivered great products on deadline.
• Experience in a modern iterative software development methodology.
• Experience with globally distributed teams and business partners.
• Experience building and maintaining applications that are mission-critical for customers.
• M.S. in Computer Science, Management Information Systems, or a related engineering field.
• 15+ years of software engineering experience.
• Demonstrated consensus builder and collegial peer.

What we offer you: Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients. A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose - to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
Posted 2 weeks ago
5.0 - 10.0 years
9 - 18 Lacs
Coimbatore
Hybrid
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Analytics Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: BE/BTech/MTech

Summary: As an Application Lead, you will be responsible for leading the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve working with Microsoft Azure Analytics Services and collaborating with cross-functional teams to deliver high-quality solutions.

Roles & Responsibilities:
• Lead the design, development, and deployment of applications using Microsoft Azure Analytics Services.
• Collaborate with cross-functional teams to ensure the timely delivery of high-quality solutions.
• Act as the primary point of contact for all application-related issues, providing technical guidance and support to team members.
• Ensure adherence to best practices and standards for application development, testing, and deployment.
• Identify and mitigate risks and issues related to application development and deployment.

Professional & Technical Skills:
• Must-have skills: strong experience in Microsoft Azure Analytics Services.
• Good-to-have skills: experience in other Microsoft Azure services such as Azure Functions, Azure Logic Apps, and Azure Event Grid.
• Experience in designing, developing, and deploying applications using Microsoft Azure Analytics Services.
• Strong understanding of cloud computing concepts and principles.
• Experience working with Agile methodologies.
• Excellent problem-solving and analytical skills.

Additional Information:
• The candidate should have a minimum of 5 years of experience in Microsoft Azure Analytics Services.
• The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality solutions.
Posted 2 weeks ago
10.0 - 20.0 years
25 - 40 Lacs
Hyderabad, Pune
Hybrid
Data Modeler / Lead - Healthcare Data Systems

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities

Data Architecture & Modeling:
• Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management.
• Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment).
• Create and maintain data lineage documentation and data dictionaries for healthcare datasets.
• Establish data modeling standards and best practices across the organization.

Technical Leadership:
• Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica.
• Architect scalable data solutions that handle large volumes of healthcare transactional data.
• Collaborate with data engineers to optimize data pipelines and ensure data quality.

Healthcare Domain Expertise:
• Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI).
• Design data models that support analytical, reporting, and AI/ML needs.
• Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations.
• Partner with business stakeholders to translate healthcare business requirements into technical data solutions.

Data Governance & Quality:
• Implement data governance frameworks specific to healthcare data privacy and security requirements.
• Establish data quality monitoring and validation processes for critical health plan metrics.
• Lead efforts to standardize healthcare data definitions across multiple systems and data sources.

Required Qualifications

Technical Skills:
• 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data.
• Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches.
• Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing.
• Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks).
• Proficiency with data modeling tools (Hackolade, ERwin, or similar).

Healthcare Industry Knowledge:
• Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data.
• Experience with healthcare data standards and medical coding systems.
• Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
• Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).

Leadership & Communication:
• Proven track record of leading data modeling projects in complex healthcare environments.
• Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements.
• Excellent communication skills, with the ability to explain technical concepts to business stakeholders.
• Experience mentoring team members and establishing technical standards.

Preferred Qualifications:
• Experience with Medicare Advantage, Medicaid, or Commercial health plan operations.
• Cloud platform certifications (AWS, Azure, or GCP).
• Experience with real-time data streaming and modern data lake architectures.
• Knowledge of machine learning applications in healthcare analytics.
• Previous experience in a lead or architect role within healthcare organizations.
Posted 2 weeks ago
4.0 - 9.0 years
1 - 2 Lacs
Kolkata, Pune, Chennai
Hybrid
Role & responsibilities:
• Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
• Provide forward-thinking solutions in the data engineering and analytics space.
• Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
• Triage issues to find gaps in existing pipelines and fix them.
• Work with the business to understand reporting-layer needs and develop data models to fulfill them.
• Help junior team members resolve issues and technical challenges.
• Drive technical discussions with the client architect and team members.
• Orchestrate the data pipelines in a scheduler via Airflow (a minimal DAG sketch follows below).

Preferred candidate profile:
• Bachelor's and/or Master's degree in Computer Science or equivalent experience.
• Must have 3+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
• Deep understanding of Star and Snowflake dimensional modelling.
• Strong knowledge of Data Management principles.
• Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
• Hands-on experience in SQL, Python, and Spark (PySpark).
• Must have experience with the AWS/Azure stack.
• Desirable: ETL with batch and streaming (Kinesis).
• Experience in building ETL / data warehouse transformation processes.
• Experience with Apache Kafka for streaming / event-based data.
• Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala).
• Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
• Experience working with structured and unstructured data, including imaging and geospatial data.
• Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
• Databricks Certified Data Engineer Associate/Professional certification (desirable).
• Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
• Experience working in Agile methodology.
• Strong verbal and written communication skills.
• Strong analytical and problem-solving skills with high attention to detail.
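As referenced in the responsibilities above, pipelines here are orchestrated via Airflow; a minimal DAG sketch follows (Airflow 2.4+ syntax), with the task bodies and schedule as placeholder assumptions.

```python
# Sketch: a two-step daily DAG. The callables are placeholders for real ETL calls
# (e.g., triggering a Databricks job); dag_id and schedule are assumed.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_step(step: str) -> None:
    print(f"run {step} here")  # placeholder for the real work

with DAG(
    dag_id="dw_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract",
                             python_callable=run_step, op_args=["extract"])
    transform = PythonOperator(task_id="transform",
                               python_callable=run_step, op_args=["transform"])
    extract >> transform  # run transform only after extract succeeds
```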
Posted 2 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Pune, Gurugram
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems - the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

Business Technology: ZS's Technology group focuses on scalable strategies, assets and accelerators that deliver to our clients enterprise-wide transformation via cutting-edge technology. We leverage digital and technology solutions to optimize business processes, enhance decision-making, and drive innovation. Our services include, but are not limited to, Digital and Technology advisory, Product and Platform development, and Data, Analytics and AI implementation.

What you'll do:
• Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
• Apply appropriate development methodologies (e.g. agile, waterfall) and best practices (e.g. mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
• Collaborate with other team members to leverage expertise and ensure seamless transitions.
• Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
• Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
• Bring transparency in driving assigned tasks to completion and report accurate status.
• Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams.
• Assist senior team members and delivery leads in project management responsibilities.

What you'll bring:
• Big Data Technologies: proficiency in working with big data technologies, particularly in the context of Azure Databricks, which may include Apache Spark for distributed data processing.
• Azure Databricks: in-depth knowledge of Azure Databricks for data engineering tasks, including data transformations, ETL processes, and job scheduling.
• SQL and Query Optimization: strong SQL skills for data manipulation and retrieval, along with the ability to optimize queries for performance in Snowflake.
• ETL (Extract, Transform, Load): expertise in designing and implementing ETL processes to move and transform data between systems, utilizing tools and frameworks available in Azure Databricks.
• Data Integration: experience with integrating diverse data sources into a cohesive and usable format, ensuring data quality and integrity.
• Python/PySpark: knowledge of programming languages like Python and PySpark for scripting and extending the functionality of Azure Databricks notebooks.
• Version Control: familiarity with version control systems, such as Git, for managing code and configurations in a collaborative environment.
• Monitoring and Optimization: ability to monitor data pipelines, identify bottlenecks, and optimize performance for Azure Data Factory.
• Security and Compliance: understanding of security best practices and compliance considerations when working with sensitive data in Azure and Snowflake environments.
• Snowflake Data Warehouse: experience in designing, implementing, and optimizing data warehouses using Snowflake, including schema design, performance tuning, and query optimization.
• Healthcare Domain Knowledge: familiarity with US health plan terminologies and datasets is essential.
• Programming/Scripting Languages: proficiency in Python, SQL, and PySpark is required.
• Cloud Platforms: experience with AWS or Azure, specifically in building data pipelines, is needed.
• Cloud-Based Data Platforms: working knowledge of Snowflake and Databricks is preferred.
• Data Pipeline Orchestration: experience with Azure Data Factory and AWS Glue for orchestrating data pipelines is necessary.
• Relational Databases: competency with relational databases such as PostgreSQL and MySQL is required; experience with NoSQL databases is a plus.
• BI Tools: knowledge of BI tools such as Tableau and Power BI is expected.
• Version Control: proficiency with Git, including branching, merging, and pull requests, is required.
• CI/CD for Data Pipelines: experience implementing continuous integration and delivery for data workflows using tools like Azure DevOps is essential.

Additional skills:
• Experience with front-end technologies such as SQL, JavaScript, HTML, CSS, and Angular is advantageous.
• Familiarity with web development frameworks like Flask, Django, and FastAPI is beneficial.
• Basic knowledge of AWS CI/CD practices is a plus.
• Strong verbal and written communication skills, with the ability to articulate results and issues to internal and client teams.
• Proven ability to work creatively and analytically in a problem-solving environment.
• Willingness to travel to other global offices as needed to work with client or other internal project teams.

Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.

To complete your application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.
Posted 2 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Hyderabad
Work from Office
Overview: The Data Science Team works on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Pipelines. You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners and final business users. This will provide you the correct visibility and understanding of the criticality of your developments.

Responsibilities:
• Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope.
• Active contributor to code and development in projects and services.
• Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption.
• Partner with ML engineers working on industrialization.
• Communicate with business stakeholders in the process of service design, training and knowledge transfer.
• Support large-scale experimentation and build data-driven models.
• Refine requirements into modelling problems.
• Influence product teams through data-based recommendations.
• Research state-of-the-art methodologies.
• Create documentation for learnings and knowledge transfer.
• Create reusable packages or libraries.
• Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards.
• Leverage big data technologies to help process data and build scaled data pipelines (batch to real time).
• Implement the end-to-end ML lifecycle with Azure Databricks and Azure Pipelines (a minimal MLflow tracking sketch follows below).
• Automate ML model deployments.

Qualifications:
• BE/B.Tech in Computer Science, Maths, or related technical fields.
• Overall 2-4 years of experience working as a Data Scientist.
• 2+ years of experience building solutions in the commercial or supply chain space.
• 2+ years working in a team to deliver production-level analytic solutions.
• Fluent in Git (version control); understanding of Jenkins and Docker is a plus.
• Fluent in SQL syntax.
• 2+ years of experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems.
• 2+ years of experience developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development.
• Data Science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models; knowledge of time series/demand forecast models is a plus.
• Programming Skills: hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL.
• Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
• Cloud (Azure): experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage.
• Business storytelling and communicating data insights in a business-consumable format; fluent in one visualization tool.
• Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities.
• Experience with Agile methodology for teamwork and analytics product creation.

Nice to have:
• Experience in Reinforcement Learning.
• Experience in simulation and optimization problems in any space.
• Experience with Bayesian methods, causal inference, NLP, Responsible AI, or distributed machine learning.
• Experience in DevOps, with hands-on experience with one or more cloud service providers: AWS, GCP, Azure (preferred).
• Model deployment experience; experience with version control systems like GitHub and CI/CD tools.
• Experience in exploratory data analysis.
• Knowledge of MLOps/DevOps and deploying ML models is preferred; experience using MLflow, Kubeflow, etc. is preferred.
• Experience executing and contributing to MLOps automation infrastructure is good to have.
• Exceptional analytical and problem-solving skills.
• Stakeholder engagement: BU, vendors.
• Experience building statistical models in the retail or supply chain space is a plus.
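For the end-to-end ML lifecycle point above, a minimal MLflow tracking sketch of the kind commonly run on Azure Databricks; the dataset, model, and metric names are illustrative assumptions.

```python
# Sketch: log a model and a metric to MLflow so it can be versioned and deployed.
# Synthetic data and names are placeholders, not a real project setup.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, random_state=42)

with mlflow.start_run():
    model = LogisticRegression().fit(X, y)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")  # stored as a run artifact
```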
Posted 2 weeks ago
3.0 - 7.0 years
6 - 10 Lacs
Mumbai
Work from Office
Senior Azure Data Engineer - L1 Support
Posted 2 weeks ago
7.0 - 12.0 years
13 - 23 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
We are eagerly seeking candidates with 5 to 13 years of experience for a Data Engineer / Lead role to join our dynamic team. The ideal candidate will play a pivotal role within the team and is a skilled professional with exposure to Python, Spark, Hive, and AWS. You will collaborate with internal teams to design, develop, deploy, and maintain software applications at scale.

Role: Data Engineer / Lead
Location: PAN India
Experience: 5 to 13 years
Job type: Full time
Work type: Hybrid

• Data Engineer with a minimum of 5 years of relevant professional experience.
• Expertise in Python scripting and big data technologies like Spark, Hive, Presto, etc.
• Experience with AWS services: IAM, EC2, S3, EMR, Lambda Functions, Step Functions, CloudWatch, Redshift, Athena, Glue, etc.
• Hands-on experience with Databricks.
• Proficient in writing Spark jobs in PySpark and Scala.
• Experience writing queries against both SQL and NoSQL DBs (Hive, HBase, MongoDB, Elasticsearch, PostgreSQL, etc.).
• Good understanding of Python data structures, including data frames, datasets, RDDs, etc.
• Experience in ML: integration of ML models.
• Experience with data profiling and data migration.
• Experience developing Hive UDFs and Hive jobs (an illustrative PySpark UDF sketch follows below).
• Proven hands-on software development experience.
• Experience with test-driven development.
• Preferred: experience in the insurance domain.
• Must have a good understanding of data warehousing concepts.
• Experience using CI/CD tools like GitHub Actions, Jenkins, Azure DevOps, etc.
• Experienced working in Agile projects: sprint planning, grooming, and providing estimations.
• Experience using JIRA, Confluence, VS Code or similar IDEs, Jupyter notebooks, etc.
• Good communication and collaboration skills with internal and external teams.
• Flexibility and ability to work in an onshore/offshore model involving multiple agile teams.
• Mentor and guide junior developers, review code, and be familiar with estimation techniques using story points.
• Strong analytical and problem-solving skills.

Qualification required: Bachelor's or Master's in Computer Science or a related field.
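Since the list above mentions Hive UDFs alongside PySpark, here is an illustrative PySpark UDF sketch; the masking rule and column name are hypothetical, chosen to echo the insurance-domain preference.

```python
# Sketch: a column-level masking UDF of the sort Hive UDF experience maps to in
# PySpark. The policy-number format and masking rule are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.getOrCreate()

@udf(returnType=StringType())
def mask_policy_number(policy):
    # Keep the last four characters, mask the rest.
    if policy is None:
        return None
    return "*" * max(len(policy) - 4, 0) + policy[-4:]

df = spark.createDataFrame([("POL1234567",)], ["policy_no"])
df.select(mask_policy_number("policy_no").alias("masked")).show()
```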
Posted 2 weeks ago
5.0 - 8.0 years
15 - 25 Lacs
Gurugram, Bengaluru
Hybrid
Warm greetings from SP Staffing!!
Role: Azure Data Engineer
Experience required: 5 to 8 years
Work location: Bangalore/Gurgaon
Required skills: Azure Databricks, ADF, PySpark/SQL
Interested candidates can send resumes to nandhini.spstaffing@gmail.com
Posted 2 weeks ago
6.0 - 8.0 years
0 - 1 Lacs
Hyderabad
Hybrid
ML Engineer | RAG, LLM, AWS, Databricks | 6–8 Yrs Exp | Build scalable ML systems with GenAI, pipelines & cloud integration
Posted 2 weeks ago
12.0 - 22.0 years
8 - 18 Lacs
Pune, Bengaluru
Hybrid
Role & responsibilities
• Understand the business area that the project is involved with.
• Work with data stewards to understand the data sources.
• Maintain a clear understanding of data entities, relationships, cardinality, etc. for the inbound sources, based on inputs from the data stewards / source-system experts.
• Performance tuning: understand the overall requirement and the reporting impact.
• Data modeling for the business and reporting models, per the reporting needs or delivery needs of other downstream systems.
• Experience with components and languages like Databricks, Python, PySpark, Scala, and R.
• Ability to ask strong questions to help the team see areas that may lead to problems.
• Ability to validate the data by writing SQL queries and comparing against the source system and transformation mapping (a reconciliation sketch follows below).
• Work closely with teams to collect and translate information requirements into data to develop data-centric solutions.
• Ensure that industry-accepted data architecture principles and standards are integrated and followed for modeling, stored procedures, replication, regulations, and security, among other concepts, to meet technical and business goals.
• Continuously improve the quality, consistency, accessibility, and security of our data activity across company needs.
• Experience with Azure DevOps project tracking or equivalent tools like JIRA.
• Should have outstanding verbal and non-verbal communication skills.
• Should have experience with, and the desire to work in, a global delivery environment.
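For the validation bullet above, a small sketch of comparing a source table against its transformed target using row counts and a key checksum; the table and column names are assumptions.

```python
# Sketch: source-vs-target reconciliation after a transformation step.
# The two table names and the key column are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

src = spark.table("source_db.customers")
tgt = spark.table("reporting.dim_customer")

def stats(df):
    # Row count plus an order-independent checksum over the business key.
    return df.agg(F.count("*").alias("rows"),
                  F.sum(F.hash("customer_id")).alias("checksum")).first()

src_stats, tgt_stats = stats(src), stats(tgt)
assert src_stats == tgt_stats, f"Mismatch: source={src_stats}, target={tgt_stats}"
```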
Posted 2 weeks ago
9.0 - 14.0 years
9 - 24 Lacs
Visakhapatnam
Work from Office
Responsibilities:
• Design, develop, and maintain data pipelines using PySpark, SQL, and databases.
• Collaborate with cross-functional teams on project delivery.
• Strong in Databricks, PySpark, and SQL.
• Databricks certification is mandatory.
• Location: Remote
Posted 2 weeks ago
10.0 - 15.0 years
22 - 37 Lacs
Bengaluru
Work from Office
Who We Are: At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role: As a GCP Data Engineer at Kyndryl, you will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment using GCP data services. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs.

Responsibilities:
• Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage (see the BigQuery sketch after this listing).
• Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements.
• Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs.
• Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows.
• Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services.
• Develop and maintain Python/PySpark code for data processing and integrate with GCP services for seamless data operations.
• Develop and optimize SQL queries for data analysis and reporting.
• Monitor and troubleshoot data pipeline issues to ensure timely resolution.
• Implement data governance and security best practices within GCP.
• Perform data quality checks and validation to ensure accuracy and consistency.
• Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines.
• Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management.
• Provide technical support and guidance to junior data engineers and other team members.
• Participate in code reviews and contribute to continuous improvement of data engineering practices.
• Implement best practices for cost management and resource utilization within GCP.

If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential.

Your Future at Kyndryl: Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are: You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Technical and Professional Experience:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field, with over 8 years of experience in data engineering.
• More than 3 years of experience with the GCP data ecosystem.
• Hands-on experience and strong proficiency in GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
• Excellent command of SQL, with the ability to write complex queries and perform advanced data transformation.
• Strong programming skills in PySpark and/or Python, specifically for building cloud-native data pipelines.
• Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc.
• Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
• Knowledge of data governance, security, and compliance best practices.
• Experience with private and public cloud architectures, their pros/cons, and migration considerations.
• Excellent problem-solving, analytical, and critical thinking skills.
• Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail.
• Communication skills: must be able to communicate with both technical and non-technical audiences, and able to derive technical requirements with stakeholders.
• Ability to work independently and in agile teams.

Preferred Technical and Professional Experience:
• GCP Data Engineer Certification is highly preferred.
• Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization.
• Experience working as a Data Engineer and/or in cloud modernization.
• Knowledge of Databricks or Snowflake for data analytics.
• Experience with NoSQL databases.
• Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
• Familiarity with BI dashboards; Google Data Studio is a plus.

Being You: Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect: With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
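To make the pipeline responsibilities above more concrete, here is a minimal batch-pipeline sketch using Apache Beam, the SDK behind Dataflow: it reads CSV rows from Cloud Storage, parses them, and loads them into BigQuery. Every name in it – project, bucket, dataset, table, and the CSV layout – is a hypothetical placeholder for illustration, not Kyndryl's implementation.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    # Assumes a simple "id,name,amount" CSV layout (illustrative only).
    fields = line.split(",")
    return {"id": int(fields[0]), "name": fields[1], "amount": float(fields[2])}

# Hypothetical project, region, and bucket; swap in "DirectRunner" to test locally.
options = PipelineOptions(
    runner="DataflowRunner",
    project="my-gcp-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromGCS" >> beam.io.ReadFromText(
            "gs://my-bucket/input/*.csv", skip_header_lines=1
        )
        | "ParseRows" >> beam.Map(parse_row)
        | "LoadToBigQuery" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.orders",  # hypothetical dataset.table
            schema="id:INTEGER,name:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )

The same pipeline shape extends naturally to streaming by swapping the text source for a Pub/Sub read, which is one reason Beam and Dataflow pair well with the Pub/Sub and BigQuery services the role names.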
Posted 2 weeks ago
5.0 - 9.0 years
14 - 24 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Engineer – Azure
Location: Bengaluru
Experience: 6+ years (3+ years on Azure data services preferred)
Department: Data Engineering / IT

Roles and Responsibilities:
The Data Engineer will work on data engineering projects for various business units, focusing on the delivery of complex data management solutions by leveraging industry best practices. They will work with the project team to build the most efficient data pipelines and data management solutions that make data easily available for consuming applications and analytical solutions. A Data Engineer is expected to possess strong technical skills.

Key Characteristics:
Technology champion who constantly pursues skill enhancement and has an inherent curiosity to understand work from multiple dimensions.
Interest and passion in Big Data technologies, and an appreciation of the value an effective data management solution can bring.
Has worked on real data challenges and handled high volume, velocity, and variety of data.
Excellent analytical and problem-solving skills, with the willingness to take ownership and resolve technical challenges.
Contributes to community-building initiatives such as CoEs and CoPs.

Mandatory skills:
Azure - Master
ELT - Skill
Data Modeling - Skill
Data Integration & Ingestion - Skill
Data Manipulation and Processing - Skill
GitHub, GitHub Actions, Azure DevOps - Skill
Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, SonarQube, PyTest - Skill

Optional skills:
Experience in project management, running a scrum team.
Experience working with BPC, Planning.
Exposure to working with an external technical ecosystem.
MkDocs documentation.
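As a hedged illustration of the ELT skills listed above, the sketch below shows what a small Databricks notebook cell might look like: read raw CSV files from an ADLS Gen2 container, apply light cleanup, and persist the result as a Delta table. The storage account, container, column, and table names are assumptions invented for the example, and `spark` is the session object the Databricks runtime provides.

from pyspark.sql import functions as F

# Hypothetical ADLS Gen2 landing path for raw sales files.
raw_path = "abfss://raw@mystorageaccount.dfs.core.windows.net/sales/"

# `spark` is supplied by the Databricks runtime; no explicit session setup needed.
raw_df = (
    spark.read
    .option("header", "true")
    .csv(raw_path)
)

# Light cleanup: deduplicate on the assumed key, fix types, stamp ingestion time.
cleaned_df = (
    raw_df.dropDuplicates(["order_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Persist as a managed Delta table (hypothetical bronze-layer name).
(
    cleaned_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("bronze.sales_orders")
)

In a production pipeline a cell like this would typically run as one activity in an ADF or Databricks Workflows orchestration rather than as a standalone notebook.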
Posted 2 weeks ago
6.0 - 10.0 years
15 - 20 Lacs
Pune, Bengaluru, Delhi / NCR
Work from Office
What impact will you make?

Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration, and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team

Deloitte's AI&D practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning.

Work you'll do

Location: Bangalore/Mumbai/Pune/Delhi/Chennai/Hyderabad/Kolkata
Role: Databricks Data Engineering Senior Consultant

We are seeking highly skilled Databricks Data Engineers to join our data modernization team. You will play a pivotal role in designing, developing, and maintaining robust data solutions on the Databricks platform. Your experience in data engineering, along with a deep understanding of Databricks, will be instrumental in building solutions that drive data-driven decision-making across a variety of customers.

Mandatory Skills: Databricks, Spark, Python / SQL

Responsibilities
• Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load data from various sources into the data lake.
• Build and maintain scalable and efficient data processing workflows using Spark (PySpark or Spark SQL), following coding standards and best practices.
• Collaborate with technical and business stakeholders to understand data requirements and translate them into technical solutions.
• Develop data models and schemas to support reporting and analytics needs.
• Ensure data quality, integrity, and security by implementing appropriate checks and controls (see the illustrative sketch at the end of this posting).
• Monitor and optimize data processing performance, identifying and resolving bottlenecks.
• Stay up to date with the latest advancements in data engineering and Databricks technologies.

Qualifications
• Bachelor's or master's degree in any field
• 6-10 years of experience in designing, implementing, and maintaining data solutions on Databricks
• Experience with at least one of the popular cloud platforms – Azure, AWS, or GCP
• Experience with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes
• Knowledge of data warehousing and data modelling concepts
• Experience with Python or SQL
• Experience with Delta Lake
• Understanding of DevOps principles and practices
• Excellent problem-solving and troubleshooting skills
• Strong communication and teamwork skills

Your role as a leader

At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify the issues that are most important for our clients, our people, and society, and to make an impact that matters.
In addition to living our purpose, Senior Consultants across our organization:
Develop high-performing people and teams through challenging and meaningful opportunities.
Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders.
Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people.
Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction.
Act as role models, embracing and living our purpose and values, and recognizing others for the impact they make.

How you will grow

At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills, in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their careers. Explore Deloitte University, The Leadership Centre.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose

Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work – always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips

We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals.
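To ground the Delta Lake and data-quality responsibilities this posting describes, here is a minimal, hypothetical PySpark sketch of a quality-gated upsert: rows failing basic checks are quarantined, and the remainder are merged into a Delta table. The paths, table, and column names are invented for illustration and do not reflect Deloitte's actual solutions; `spark` is assumed to be the session a Databricks runtime provides.

from delta.tables import DeltaTable
from pyspark.sql import functions as F

# Hypothetical staging location for incoming customer records.
incoming = spark.read.format("parquet").load("/mnt/staging/customers")

# Basic quality gate: a row must carry a primary key and an email address.
valid = incoming.filter(
    F.col("customer_id").isNotNull() & F.col("email").isNotNull()
)
rejected = incoming.filter(
    F.col("customer_id").isNull() | F.col("email").isNull()
)

# Quarantine the rejects for later inspection instead of silently dropping them.
rejected.write.format("delta").mode("append").save("/mnt/quarantine/customers")

# Upsert the valid rows into the target Delta table (hypothetical silver layer).
target = DeltaTable.forName(spark, "silver.customers")
(
    target.alias("t")
    .merge(valid.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

The MERGE keeps the load idempotent: re-running the job against the same staging data updates existing keys rather than duplicating them.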
Posted 2 weeks ago