
3417 Databricks Jobs - Page 9

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

4.0 years

0 Lacs

Greater Kolkata Area

Remote


Location: Remote/Hybrid (as applicable)
Experience: 4-7 Years
Type: Full-time | Contractual

Job Description
We are seeking a talented and experienced Senior Software Engineer to join our dynamic product team. You will be responsible for designing, developing, and maintaining our cloud-based Software-as-a-Service (SaaS) platform. This role involves taking ownership of backend architecture and ensuring our systems are scalable, high-performing, and fault-tolerant. You will work closely with product managers, frontend engineers, and DevOps to deliver innovative and reliable technical solutions.

Responsibilities
• Translate product requirements into high-level system designs and implementation plans.
• Develop scalable, maintainable, and high-performance backend solutions using Java, Python, C++, and relevant technologies.
• Design and implement distributed systems and data processing pipelines to support SaaS workloads.
• Optimize systems for performance, cost-efficiency, and scalability.
• Leverage cloud platforms (AWS, GCP, Azure) and cloud data warehouses (Snowflake, BigQuery, Databricks).
• Implement containerization and orchestration strategies using Docker and Kubernetes.
• Work with both SQL and NoSQL databases to build robust data layers.
• Mentor junior engineers and participate in onboarding processes.
• Stay current with industry trends and best practices in software engineering and distributed systems.

Requirements
• Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
• 4-7 years of experience in backend development, with strong hands-on coding in Java, C++, or Python.
• Proven experience with cloud data warehouses (Snowflake, BigQuery, Databricks).
• Solid understanding of cloud platforms (AWS, Azure, GCP).
• Hands-on experience with Docker, Kubernetes, and microservices architecture.
• Familiarity with CI/CD pipelines, version control (Git), and DevOps practices.
• Strong understanding of database systems (both SQL and NoSQL).
• Experience with SaaS platform development and large-scale distributed systems.
• Contributions to open-source projects or an active presence in the tech community are a plus.

(ref:hirist.tech)

Posted 3 days ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Company Description
Cyanous is a leading global information technology, consulting, and business process services company. Our mission is to empower every individual and organization to achieve more and adapt to the digital world. We leverage cognitive computing, hyper-automation, robotics, cloud, analytics, and emerging technologies to drive transformation and success for our clients. Dedicated to addressing global challenges, we collaborate with employees, clients, partners, public institutions, and community organizations globally.

Role Description
This is a full-time role for a Big Data Developer based on-site in Chennai.

Responsibilities
The Big Data Developer will be responsible for designing, developing, and managing data processing systems. This includes working on data integration, Extract Transform Load (ETL) processes, and ensuring data accuracy and integrity. The role also involves collaborating with cross-functional teams to deliver analytics solutions and continuously improve existing data solutions.

Qualifications
• Proficiency in Data Engineering and Big Data technologies.
• Experience with Extract Transform Load (ETL) processes and Data Warehousing.
• Strong background in Software Development.
• Excellent problem-solving and analytical skills.
• Ability to work collaboratively with cross-functional teams.
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Experience in the IT consulting industry is a plus.

Must Have
• Minimum 8 years of experience in Spark, Scala, and Big Data, with exposure to cloud platforms (AWS, Azure, GCP) for big data processing and storage.
• Strong experience in Azure DLS.
• Strong experience in Databricks and data pipelines.
• Experience in Hadoop.
• Strong backend development expertise, particularly in Java (Spring).

Good to Have
• Agile delivery experience.

(ref:hirist.tech)

Posted 3 days ago

Apply

5.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Key Responsibilities & Skillsets:
• 5-10 years of relevant Data Engineering experience
• Azure Databricks: ability to create data transformation logic
• Strong programming skills in Python and experience with SQL; able to write complex SQL, Transact-SQL, and stored procedures
• ETL tools: Azure Data Factory, Databricks
• Experience with data modelling
• DWH: Snowflake
• Excellent communication skills and stakeholder management
• Ability to work independently in an IC role

Good to Have
• Knowledge of the Snowflake platform
• Knowledge of Power BI
• Familiarity with CI/CD practices and version control (e.g., Git)
• Familiarity with Azure DevOps
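The "complex SQL" this kind of role calls for frequently means window-function transformations. Below is a minimal, hypothetical sketch (table and column names are invented for illustration, not from the posting) that keeps only the latest revision of each record, using Python's built-in SQLite as a stand-in for a warehouse engine such as Snowflake or Databricks SQL:

```python
import sqlite3

# In-memory SQLite database stands in for a warehouse (Snowflake / Databricks SQL).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL, updated_at TEXT);
INSERT INTO orders VALUES
  (1, 'acme', 100.0, '2025-01-01'),
  (1, 'acme', 120.0, '2025-02-01'),
  (2, 'globex', 80.0, '2025-01-15');
""")

# Typical transformation step: keep only the latest revision of each order
# using a ROW_NUMBER() window function, as one would in a warehouse view.
rows = conn.execute("""
SELECT order_id, customer, amount FROM (
  SELECT *, ROW_NUMBER() OVER (
    PARTITION BY order_id ORDER BY updated_at DESC
  ) AS rn
  FROM orders
)
WHERE rn = 1
ORDER BY order_id
""").fetchall()

print(rows)  # [(1, 'acme', 120.0), (2, 'globex', 80.0)]
```

The same `ROW_NUMBER() OVER (PARTITION BY … ORDER BY …)` pattern carries over unchanged to Snowflake and Databricks SQL.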

Posted 3 days ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


About Gartner IT
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About The Role
Senior Data Engineer for production support who will provide end-to-end support for daily data loads and manage production issues.

What Will You Do
• Monitor and support various data loads for our Enterprise Data Warehouse.
• Support business users accessing Power BI dashboards and data warehouse tables.
• Handle incidents and service requests within defined SLAs.
• Work with the team on managing Azure resources, including but not limited to Databricks, Azure Data Factory pipelines, ADLS, etc.
• Build new ETL/ELT pipelines using Azure data products such as Azure Data Factory and Databricks.
• Help build best practices and processes.
• Coordinate with upstream/downstream teams to resolve data issues.
• Work with the QA and Dev teams to ensure appropriate automated regressions are added to detect such issues in the future.
• Work with the Dev team to improve automated error handling so manual interventions can be reduced.
• Analyze processes and patterns so that similar unreported issues can be resolved in one go.

What You Will Need
Strong IT professional with 3-4 years of experience in Data Engineering. The candidate should have strong analytical and problem-solving skills.

Must Have
• 3-4 years of experience in data warehouse design and development and ETL using Azure Data Factory (ADF)
• Experience writing complex T-SQL procedures on MPP platforms (Synapse, Snowflake, etc.)
• Experience analyzing complex code to troubleshoot failures and, where applicable, recommending best practices around error handling, performance tuning, etc.
• Ability to work independently as well as part of a team, with experience working with fast-paced operations/dev teams
• Good understanding of business processes and analyzing underlying data
• Understanding of dimensional and relational modelling
• Detail-oriented, with the ability to plan, prioritize, and meet deadlines in a fast-paced environment
• Knowledge of Azure cloud technologies
• Exceptional problem-solving skills

Nice To Have
• Experience crafting, building, and deploying applications in a DevOps environment utilizing CI/CD tools
• Understanding of dimensional and relational modeling
• Relevant certifications
• Basic knowledge of Power BI

Who Are You
• Bachelor's degree or foreign equivalent degree in Computer Science or a related field required
• Excellent communication skills
• Able to work independently or within a team proactively in a fast-paced AGILE-SCRUM environment
• Owns success: takes responsibility for the successful delivery of solutions
• Strong desire to improve upon their skills in tools and technologies

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we?
At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities.
If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID: 99740 By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description
Intern - Data Solutions

As an Intern - Data Solutions, you will be part of the Commercial Data Solutions team, providing technical/data expertise in the development of analytical data products to enable data science & analytics use cases. In this role, you will create and maintain data assets/domains used in the commercial/marketing analytics space to develop best-in-class data pipelines and products, working closely with data product owners to translate data product requirements and user stories into development activities throughout all phases of design, planning, execution, testing, deployment, and delivery.

Your Specific Responsibilities Will Include
• Hands-on development of last-mile data products using the most up-to-date technologies and software/data/DevOps engineering practices
• Enabling data science & analytics teams to drive data modeling and feature engineering activities aligned with business questions and utilizing datasets in an optimal way
• Developing deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for
• Building data products based on automated data models, aligned with use case requirements, and advising data scientists, analysts, and visualization developers on how to use these data models
• Developing analytical data products for reusability, governance, and compliance by design
• Aligning with organization strategy and implementing a semantic layer for analytics data products
• Supporting data stewards and other engineers in maintaining data catalogs, data quality measures, and governance frameworks

Education
B.Tech/B.S., M.Tech/M.S., or PhD in Computer Science, Engineering, Pharmaceuticals, Healthcare, Data Science, Business, or a related field

Required Experience
• High proficiency in SQL, Python, and AWS
• Good understanding and comprehension of the requirements provided by the Data Product Owner and Lead Analytics Engineer
• Understanding of creating/adopting data models to meet requirements from Marketing, Data Science, and Visualization stakeholders
• Experience with feature engineering
• Hands-on experience with cloud-based (AWS/GCP/Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.)
• Hands-on experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot, and low-code tools (e.g., Dataiku)
• Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders

Current Employees apply HERE

Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.
Employee Status: Intern/Co-op (Fixed Term)
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Intelligence (BI), Data Management, Data Modeling, Data Visualization, Measurement Analysis, Stakeholder Relationship Management, Waterfall Model

Preferred Skills:

Job Posting End Date: 06/16/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.

Requisition ID: R344334

Posted 3 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


I am thrilled to share an exciting opportunity with one of our esteemed clients! 🚀 Join me in exploring new horizons and unlocking potential. If you're ready for a challenge and growth, read on.

Experience: 7+ years
Location: Chennai, Hyderabad
Immediate joiners only; work from office (WFO)

Mandatory skills: SQL, Python, PySpark, Databricks (strong in core Databricks), AWS (AWS is mandatory)

JD:
• Manage and optimize cloud infrastructure (AWS, Databricks) for data storage, processing, and compute resources, ensuring seamless data operations.
• Implement data quality checks, validation rules, and transformation logic to ensure the accuracy, consistency, and reliability of data.
• Integrate data from multiple sources, ensuring data is accurately transformed and stored in optimal formats (e.g., Delta Lake, Redshift, S3).
• Automate data workflows using tools like Airflow, Databricks APIs, and other orchestration technologies to streamline data ingestion, processing, and reporting tasks.

Regards,
R Usha
usha@livecjobs.com
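The posting asks for data quality checks and validation rules before data lands in curated formats such as Delta Lake. Here is a minimal, framework-agnostic Python sketch of a row-level validation step; the field names and rules are illustrative assumptions, not from the posting, and in production this logic would typically run inside a Databricks job:

```python
# Hypothetical row-level validation step, as might run in a Databricks job
# before writing curated data to Delta Lake. Rules and field names are
# illustrative assumptions, not from the posting.

def validate_row(row):
    """Return a list of rule violations for one record (empty list = valid)."""
    errors = []
    if row.get("order_id") is None:
        errors.append("order_id is required")
    amount = row.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    if row.get("currency") not in {"USD", "EUR", "INR"}:
        errors.append("currency not in allowed set")
    return errors

def split_valid_invalid(rows):
    """Partition a batch into clean rows and quarantined rows with reasons."""
    valid, invalid = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            invalid.append({**row, "_errors": errors})  # quarantine with reasons
        else:
            valid.append(row)
    return valid, invalid

batch = [
    {"order_id": 1, "amount": 250.0, "currency": "USD"},
    {"order_id": None, "amount": -5, "currency": "GBP"},
]
valid, invalid = split_valid_invalid(batch)
print(len(valid), len(invalid))  # 1 1
```

Quarantining invalid rows with their failure reasons, rather than dropping them, is what makes downstream reconciliation and reporting on data quality possible.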

Posted 3 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description You are a strategic thinker passionate about driving solutions in External Reporting . You have found the right team. As an External Reporting Associate in our Finance team, you will define, refine, and deliver set goals for our firm. You will collaborate across the firm to provide comprehensive analysis and oversight of our reporting processes. In your role as a Firmwide Regulatory Reporting & Analysis (FRRA) – Data Controllers & Reporting (DCR) – Associate, you will work with teams on production processing and reporting activities, focusing on U.S. Regulatory Reports like FR Y-9C, Call Report, and CCAR. The FRRA team, part of Corporate Finance, is responsible for executing the Firm’s regulatory reporting requirements to U.S. regulators, ensuring accuracy and consistency in reporting and capital stress testing submissions. We are the DCR team within FRRA, a diverse global organization committed to data completeness and accuracy across 25+ jurisdictions. Our mission involves data sourcing, validations, adjustment processing, and reconciliations to support our financial reporting platform. Job Responsibilities Manage BAU activities, including data sourcing, data validation and completeness, adjustments processing, and performing reconciliations. Execute overall operating model and procedures for functional areas in the reporting space. Manage client relations, communications, and presentations. Support business users of the FRI application with user queries and issue resolutions. Identify and execute process improvements to the existing operating model, tools, and procedures. Interact with Controllers, Report owners, and RFT (Risk & Finance Technology) partners. Act as an interface with Control partners, ensuring compliance with risk and controls policies. Escalate issues as needed to the appropriate team(s) and management. Partner with projects team through the full project life cycles. 
Lead programs and initiatives for reporting automation and operating model optimization. Required Qualifications, Skills, And Capabilities Bachelor’s degree in Accounting, Finance, or a related discipline 3+ years of financial services or related experience Strong oral and written communication with the ability to effectively partner with managers and stakeholders at all levels Strong working knowledge of MS office applications (MS Excel, MS Word, MS PowerPoint), specifically with reconciliations, summarizing and formatting data Experience using data management & visualization tools in a reporting setting: AWS Databricks, Alteryx, SQL, Tableau, Visio Client & business focused; able to work collaboratively and build strong partnerships with clients and colleagues at all levels Aptitude and desire to learn quickly, be flexible, and think strategically Strong process and project management skills Preferred Qualifications, Skills, And Capabilities Familiarity with US Regulatory reporting (e.g., Y9C, Call Report, CCAR, etc.), controllership functions, banking & brokerage products, and US GAAP accounting principles Control mindset and exposure to establishing or enhancing existing controls Strong verbal and written communication skills with the ability to present information at varying levels of detail depending on the audience Enthusiastic, self-motivated, effective under pressure and strong work ethic and keen attention to detail and accuracy ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we’re setting our businesses, clients, customers and employees up for success. Global Finance & Business Management works to strategically manage capital, drive growth and efficiencies, maintain financial reporting and proactively manage risk. By providing information, analysis and recommendations to improve results and drive decisions, teams ensure the company can navigate all types of market conditions while protecting our fortress balance sheet.

Posted 3 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


I am thrilled to share an exciting opportunity with one of our esteemed clients! 🚀 Join me in exploring new horizons and unlocking potential. If you're ready for a challenge and growth, read on.

Experience: 7+ years
Location: Chennai, Hyderabad
Immediate joiner, work from office (WFO)

Mandatory skillset: Data Modeler with R&D Domain (must)

Qualifications:
• Experience in business data management, information architecture, or other related field
• Bachelor’s degree in Computer Science, Information Systems, Data Management, or related field
• Hands-on experience in data modeling, data or information architecture within Pharma R&D
• Demonstrated ability to design scalable, complex data models from conceptualization through physical optimization, aligned to business needs
• Strong understanding of R&D data domains (Research, Clinical Operations, Safety, Regulatory) and related compliance requirements (GxP, validation)
• Solid grasp of data integration patterns, ETL/ELT pipelines, and source-to-target mapping
• Experience with data governance and data management tools (Databricks, Informatica, Reltio, ERwin, and storage platforms for structured and unstructured data like Azure, AWS, Oracle, both cloud and on-prem)
• Familiarity with SDLC, change control, and quality/compliance processes
• Excellent analytical, communication, and stakeholder-management skills
• Strategic thinker with an eye for detail and a passion for data excellence
• Collaborative mindset, adept at influencing across matrixed organizations
• Proactive learner who stays current on data modeling trends and best practices
• Customer-focused, with a commitment to delivering high-quality, fit-for-purpose data solutions

Travel: Minimal

Regards,
R Usha
usha@livecjobs.com

Posted 3 days ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Hi {fullName},

There is an opportunity for a Data Engineer in KOLKATA, with a walk-in interview at KOLKATA on 21st Jun 25 between 9:30 AM and 12:30 PM. Please share the details below to mamidi.p@tcs.com with the subject line "Data Engineer 21st Jun 25" if you are interested.

Email id:
Contact no:
Total EXP:
Preferred Location:
CURRENT CTC:
EXPECTED CTC:
NOTICE PERIOD:
CURRENT ORGANIZATION:
HIGHEST QUALIFICATION THAT IS FULL TIME:
HIGHEST QUALIFICATION UNIVERSITY:
ANY GAP IN EDUCATION OR EMPLOYMENT:
IF YES, HOW MANY YEARS AND REASON FOR GAP:
ARE YOU AVAILABLE FOR WALK-IN INTERVIEW AT KOLKATA ON 21ST JUN 25 (YES/NO):

Desired Competencies (Technical/Behavioral Competency)

Must-Have (ideally not more than 3-5):
1. Azure Data Factory
2. Azure Databricks
3. Python
4. SQL query writing

Good-to-Have:
1. PySpark
2. SQL query optimization
3. PowerShell

Responsibility of / Expectations from the Role:
1. Developing/designing solutions from detailed design specifications.
2. Playing an active role in defining standards in coding, system design, and architecture.
3. Revising, refactoring, updating, and debugging code.
4. Customer interaction.
5. Must have a strong technical background and hands-on coding experience in Azure Data Factory, Azure Databricks, and SQL.

THANKS & REGARDS
PRIYANKA MAMIDI

Posted 3 days ago

Apply

7.0 years

0 Lacs

Kochi, Kerala, India

On-site


Greetings from TCS Recruitment Team!

Role: Databricks Lead / Databricks Solution Architect / Databricks ML Engineer
Years of experience: 7 to 18 years
Walk-In-Drive Location: Kochi
Walk-in Location Details: Tata Consultancy Services, TCS Centre SEZ Unit, Infopark Kochi Phase 1, Infopark Kochi P.O, Kakkanad, Kochi - 682042, Kerala, India
Drive Time: 9 AM to 1:00 PM
Date: 21-Jun-25

Must Have:
• 5+ years of experience in data engineering or related fields
• At least 2-3 years of hands-on experience with Databricks (using Apache Spark, Delta Lake, etc.)
• Solid experience working with big data technologies such as Hadoop, Spark, Kafka, or similar
• Experience with cloud platforms (AWS, Azure, or GCP) and cloud-native data tools
• Experience with machine learning frameworks and pipelines, particularly in Databricks
• Experience with AI/ML model deployment, MLOps, and ML lifecycle management using Databricks and related tools

Regards,
Sundar V

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Role: Data Engineer with GCP
Experience: 5+ years
Location: Bangalore | Gurugram | Noida | Pune
Notice: Immediate joiners
Mode: Hybrid

JD:
• Develop and automate Python scripts for data processing and transformation.
• Design, implement, and manage data pipelines to facilitate seamless data integration and flow.
• Utilize GCP services, particularly BigQuery and Cloud Functions, to support data processing needs.
• Create and optimize advanced SQL queries for efficient data retrieval and manipulation in BigQuery.
• Collaborate with cross-functional teams to gather requirements and implement data solutions.
• Work with Apache and Databricks to enhance data processing capabilities.
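To give a flavor of the "Python scripts for data processing and transformation" this role describes, here is a minimal, hypothetical sketch of a normalization step that might run in a Cloud Function before rows are loaded into BigQuery. The record layout and field names are illustrative assumptions, not from the posting; the logic is plain Python so it can be tested locally without GCP credentials:

```python
# Hypothetical transformation step for a GCP pipeline: normalize raw event
# records before loading them into BigQuery. In production this might run in
# a Cloud Function; field names here are illustrative assumptions.
from datetime import datetime, timezone

def transform_event(raw):
    """Normalize one raw event dict into a BigQuery-ready row."""
    ts = datetime.fromtimestamp(raw["ts_epoch"], tz=timezone.utc)
    return {
        "event_id": str(raw["id"]),
        "event_type": raw.get("type", "unknown").lower(),
        "event_date": ts.date().isoformat(),   # candidate partition column
        "payload_size": len(raw.get("payload", "")),
    }

rows = [transform_event(r) for r in [
    {"id": 7, "type": "CLICK", "ts_epoch": 1735689600, "payload": "abc"},
    {"id": 8, "ts_epoch": 1735689600},  # missing optional fields
]]
print(rows[0]["event_type"], rows[1]["event_type"])  # click unknown
```

Keeping the transformation as a pure function of the input record makes it easy to unit test and to reuse unchanged across orchestrators (Cloud Functions, Airflow tasks, or Databricks jobs).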

Posted 3 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Welcome to Warner Bros. Discovery… the stuff dreams are made of. Who We Are… When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives, to technology trailblazers, across the globe, WBD offers career defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive. Your New Role Our Media Platform Operations team is looking for a highly capable DBA manager that can manage and lead a team of DBA members. You will manage a team of professionals to design and develop database systems, provide guidance to database team on database structures and features. You will create standard procedures to enhance scalability and performance of existing database architecture. Troubleshoot complex database issues in accurate and timely manner. Maintain database disaster recovery procedures to ensure continuous availability and speedy recovery. Develop best practices for performance and operational efficiency. Provide regular updates to management on database project status. Stay updated with new database technologies and analyze such technologies to bring into scope of existing infrastructure. This role requires a strategic thinker with excellent leadership skills and in-depth technical expertise to ensure the reliability, security, and performance of our data infrastructure Your Role Accountabilities OPERATIONS/PROJECT MANAGEMENT Lead, mentor, and manage a team of DBAs, fostering a culture of continuous improvement and professional growth. Conduct regular performance reviews, providing feedback and setting goals for team members. 
Coordinate with HR for recruiting, interviewing, and hiring new team members as needed. Provide expert-level guidance in database design, performance tuning, backup and recovery strategies, and data modeling. Lead troubleshooting efforts for complex database issues and optimize existing systems for better performance and cost efficiency. Keep abreast of emerging database technologies and best practices, recommending innovations that can enhance data management capabilities. Collaborate with cross-functional teams, including software development, network infrastructure, and security, to align database operations with business needs. Serve as the primary point of contact for database-related audits and ensure compliance with data protection regulations. Establish and maintain key performance indicators (KPIs) for database operations and provide regular reports to senior management. Plan and manage database projects, including migrations, upgrades, and new deployments, ensuring they are completed on time and within budget. Allocate resources efficiently to meet project demands and balance team workload. Coordinates DBA activities with the infrastructure team to ensure database servers are built according to customer requirements in a timely manner. Working effectively with a team that is globally dispersed. Serves as a mentor for Database Administrators and Associate Database Administrators. Provides additional support and guidance to DBAs/Associate DBAs with regards to problem solving, escalations and day to day work related challenges. Provide 24/7 support. STRATEGY Develop and execute a strategic roadmap for database management aligned with organizational objectives Oversee the design, deployment, and management of high-availability and disaster recovery solutions. Ensure databases are secure, scalable, and meet the performance requirements of applications. 
Assess the skill set of the database team members and come up with a plan to bridge the gap by providing training or mentoring. Collaborate with IT and business leaders to define the architecture and roadmaps for future database projects. Lead the team in identifying, proposing, and implementing new and emerging technologies to support ongoing projects and business operations. Guide the team in conducting research, developing, and implementing new technologies to support future projects. Take the lead in communicating with internal and external stakeholders. ANALYTICS Implement robust monitoring and alerting mechanisms to quickly detect and resolve database-related issues. Review the existing database standard documents and come up with new standards for all our database platforms (Oracle, SQL Server, SAP HANA, Db2, MySQL, PostgreSQL, Snowflake, and Databricks). Lead the team to develop database structures and features according to organizational needs. Help build the structure and design of the database. Lead the team in creating, reviewing, and maintaining database documentation. Lead the team to perform and plan upgrades and re-platforms to align with the company’s vision. Create, review, and maintain operational documentation that can be used by our 24/7 operations team and junior database administrators. Qualifications & Experiences Bachelor's degree in computer science, information systems, or information technology. 10+ years of experience in database management and leading database teams. Proficiency in SQL and experience with multiple relational and non-relational database systems such as Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, etc. Strong understanding of data architecture, data integration, and ETL processes. Familiarity with cloud database solutions (e.g., AWS RDS, Azure SQL, Google Cloud Spanner) and hybrid environments. Excellent problem-solving skills and ability to work under pressure.
Exemplary communication skills, with the ability to communicate complex technical concepts to non-technical stakeholders. Very good experience in database administration of the various database platforms (Oracle, SQL Server, SAP HANA, Db2, MySQL, PostgreSQL, Snowflake, and Databricks). Experience working on databases that are hosted both on-prem and in the AWS cloud. Good working experience with AWS RDS databases is required. Experience automating, scripting, and streamlining processes for efficiency and accuracy utilizing Unix shell scripting and Windows BAT. Ability and experience with the development of processes and procedures to standardize database installations and configuration. Extensive experience with implementation and maintenance of Disaster Recovery and High Availability. Ability to work on unusually complex technical problems and provide solutions that are highly innovative and ingenious. Ability to provide technical documentation and project plans for technical staff members. Excellent communication, presentation, and customer relationship skills. Must have the legal right to work in the United States. Ability to provide 24/7 support. Not Required But Preferred Experience Public speaking and presentation skills. How We Get Things Done… This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview. Championing Inclusion at WBD Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences.
Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request. Show more Show less

Posted 3 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Hi {fullName}, there is an opportunity for Azure DevOps (CI/CD Pipelines, Migration) in Hyderabad, for which a walk-in interview at Hyderabad will be held on 21st Jun 25 between 9:30 AM and 12:30 PM. Please share the below details to mamidi.p@tcs.com with the subject line "Azure DevOps 21st Jun 25" if you are interested: Email id: Contact no: Total EXP: Preferred Location: Current CTC: Expected CTC: Notice Period: Current Organization: Highest full-time qualification: Highest qualification university: Any gap in education or employment: If yes, how many years and reason for gap: Are you available for a walk-in interview at Hyderabad on 21st Jun 25 (Yes/No): We will send you a mail by tomorrow night if you are shortlisted. Desired Competencies (Technical/Behavioral Competency) Must-Have Extensive automation experience in Azure DevOps. Practical delivery experience in Infrastructure as Code (IaC) using Bicep. Automate deployment tasks using scripts (e.g., PowerShell, Bash) within your CI/CD pipelines. Understand how to integrate the code artifacts of the following Azure data services into CI/CD pipelines: Azure Data Factory, Azure Databricks, Azure Machine Learning, SQL Database, Azure OpenAI. Experience in Azure CI (integrating developers' code into a shared repo) and CD (deploying both IaC scripts and code artifacts into Dev, Test, and UAT environments). Good-to-Have Terraform experience is a plus. Roles & Responsibilities Own and deliver DevOps pipelines for multiple, diverse DevOps teams. Define, propose and execute a feature roadmap towards developing strong and mature DevOps pipelines. Lead design discussions and the development of innovative cloud and DevOps solutions. Work on Azure cloud for infrastructure automation using cloud-native services.
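The deployment-automation scripting the listing describes (driving Bicep deployments from a pipeline script) can be sketched as command assembly in Python; the resource group, template file, and parameter names below are hypothetical, and a real pipeline step would typically invoke the `az` CLI from PowerShell or Bash:

```python
import shlex

def build_bicep_deploy_command(resource_group: str, template_file: str,
                               parameters: dict) -> str:
    """Assemble an `az deployment group create` command for a Bicep template.

    Returns the command as a single string; a pipeline task would execute it
    via subprocess.run or a shell step.
    """
    cmd = [
        "az", "deployment", "group", "create",
        "--resource-group", resource_group,
        "--template-file", template_file,
    ]
    for name, value in parameters.items():
        cmd += ["--parameters", f"{name}={value}"]
    return shlex.join(cmd)

# Hypothetical values for illustration only.
command = build_bicep_deploy_command(
    "rg-data-dev", "main.bicep", {"environment": "dev"}
)
print(command)
```

The same builder can feed each of the Dev, Test, and UAT stages by varying only the resource group and parameter values.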

Posted 3 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Welcome to Warner Bros. Discovery… the stuff dreams are made of. Who We Are… When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive. Your New Role Our Media Platform Operations team is looking for a highly capable DBA manager who can manage and lead a team of DBAs. You will manage a team of professionals to design and develop database systems, and provide guidance to the database team on database structures and features. You will create standard procedures to enhance the scalability and performance of the existing database architecture, troubleshoot complex database issues in an accurate and timely manner, and maintain database disaster recovery procedures to ensure continuous availability and speedy recovery. You will develop best practices for performance and operational efficiency, provide regular updates to management on database project status, and stay updated with new database technologies, analyzing them to bring them into the scope of the existing infrastructure. This role requires a strategic thinker with excellent leadership skills and in-depth technical expertise to ensure the reliability, security, and performance of our data infrastructure. Your Role Accountabilities OPERATIONS/PROJECT MANAGEMENT Lead, mentor, and manage a team of DBAs, fostering a culture of continuous improvement and professional growth. Conduct regular performance reviews, providing feedback and setting goals for team members.
Coordinate with HR for recruiting, interviewing, and hiring new team members as needed. Provide expert-level guidance in database design, performance tuning, backup and recovery strategies, and data modeling. Lead troubleshooting efforts for complex database issues and optimize existing systems for better performance and cost efficiency. Keep abreast of emerging database technologies and best practices, recommending innovations that can enhance data management capabilities. Collaborate with cross-functional teams, including software development, network infrastructure, and security, to align database operations with business needs. Serve as the primary point of contact for database-related audits and ensure compliance with data protection regulations. Establish and maintain key performance indicators (KPIs) for database operations and provide regular reports to senior management. Plan and manage database projects, including migrations, upgrades, and new deployments, ensuring they are completed on time and within budget. Allocate resources efficiently to meet project demands and balance team workload. Coordinate DBA activities with the infrastructure team to ensure database servers are built according to customer requirements in a timely manner. Work effectively with a team that is globally dispersed. Serve as a mentor for Database Administrators and Associate Database Administrators. Provide additional support and guidance to DBAs/Associate DBAs with regard to problem solving, escalations and day-to-day work-related challenges. Provide 24/7 support. STRATEGY Develop and execute a strategic roadmap for database management aligned with organizational objectives. Oversee the design, deployment, and management of high-availability and disaster recovery solutions. Ensure databases are secure, scalable, and meet the performance requirements of applications.
Assess the skill set of the database team members and come up with a plan to bridge gaps by providing training or mentoring. Collaborate with IT and business leaders to define the architecture and roadmaps for future database projects. Lead the team in identifying, proposing and implementing new and emerging technologies to support ongoing projects and business operations. Guide the team in researching, developing, and implementing new technologies to support future projects. Take the lead in communicating with internal and external stakeholders. ANALYTICS Implement robust monitoring and alerting mechanisms to quickly detect and resolve database-related issues. Review the existing database standards documents and come up with new standards for all our database platforms (Oracle, SQL Server, SAP HANA, Db2, MySQL, PostgreSQL, Snowflake, and Databricks). Lead the team to develop database structures and features according to organizational needs. Help build the structure and design of the database. Lead the team in creating, reviewing and maintaining database documentation. Lead the team to plan and perform upgrades and re-platforms to align with the company’s vision. Create, review and maintain operational documentation that can be used by our 24/7 operations team and junior database administrators. Qualifications & Experiences Bachelor's degree in computer science, information systems, or information technology. 10+ years of experience in database management and leading a database team. Proficiency in SQL and experience with multiple relational and non-relational database systems such as Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, etc. Strong understanding of data architecture, data integration, and ETL processes. Familiarity with cloud database solutions (e.g., AWS RDS, Azure SQL, Google Cloud Spanner) and hybrid environments. Excellent problem-solving skills and ability to work under pressure.
Exemplary communication skills, with the ability to communicate complex technical concepts to non-technical stakeholders. Very good experience in database administration of the various database platforms (Oracle, SQL Server, SAP HANA, Db2, MySQL, PostgreSQL, Snowflake, and Databricks). Experience working on databases that are hosted both on-premises and in the AWS cloud. Good working experience with AWS RDS databases is required. Experience automating, scripting, and streamlining processes for efficiency and accuracy utilizing Unix shell scripting and Windows BAT. Ability and experience with the development of processes and procedures to standardize database installations and configuration. Extensive experience with the implementation and maintenance of disaster recovery and high availability. Ability to work on unusually complex technical problems and provide solutions that are highly innovative and ingenious. Ability to provide technical documentation and project plans for technical staff members. Excellent communication, presentation, and customer relationship skills. Must have the legal right to work in the United States. Ability to provide 24/7 support. Not Required But Preferred Experience Public speaking and presentation skills. How We Get Things Done… This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview. Championing Inclusion at WBD Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences.
Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities Establish and implement best practices for DBT workflows, ensuring efficiency, reliability, and maintainability. Collaborate with data analysts, engineers, and business teams to align data transformations with business needs. Monitor and troubleshoot data pipelines to ensure accuracy and performance. Work with Azure-based cloud technologies to support data storage, transformation, and processing. Preferred Education Master's Degree Required Technical And Professional Expertise Strong MS SQL and Azure Databricks experience. Implement and manage data models in DBT, ensuring data transformation and alignment with business requirements. Ingest raw, unstructured data into structured datasets in a cloud object store. Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting. Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance. Preferred Technical And Professional Experience Establish best DBT practices to improve performance, scalability, and reliability. Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Databricks. Proven interpersonal skills while contributing to team effort by accomplishing related results as required.
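The raw-to-structured transformation a DBT staging model performs can be illustrated with a minimal sketch using Python's built-in sqlite3; the table and column names are invented, and a real DBT model would be a SQL file with Jinja `ref()` calls rather than Python:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (payload_user TEXT, payload_amount TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("alice", "10.50"), ("bob", "3.25"), ("alice", "7.00")],
)

# The SELECT below is the kind of cleanup/typing logic a staging model holds:
# trim strings, cast text to numerics, and aggregate into an analysis-ready shape.
rows = conn.execute(
    """
    SELECT TRIM(payload_user) AS user_name,
           ROUND(SUM(CAST(payload_amount AS REAL)), 2) AS total_amount
    FROM raw_events
    GROUP BY user_name
    ORDER BY user_name
    """
).fetchall()
print(rows)  # → [('alice', 17.5), ('bob', 3.25)]
```

In DBT the same SELECT would live in `models/staging/`, with tests (not shown) asserting uniqueness and non-null constraints on `user_name`.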

Posted 3 days ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Linkedin logo

About This Role About Aladdin Financial Engineering (AFE): Join a diverse and collaborative team of over 300 modelers and technologists in Aladdin Financial Engineering (AFE) within BlackRock Solutions, the business responsible for the research and development of Aladdin’s financial models. This group is also accountable for analytics production, enhancing the infrastructure platform and delivering analytics content to portfolio and risk management professionals (both within BlackRock and across the Aladdin client community). The models developed and supported by AFE span a wide array of financial products covering equities, fixed income, commodities, derivatives, and private markets. AFE provides investment insights that range from an analysis of cash flows on a single bond, to the overall financial risk associated with an entire portfolio, balance sheet, or enterprise. Role Description We are looking for a person to join the Advanced Data Analytics team within AFE Single Security. Advanced Data Analytics is a team of Quantitative Data and Product Specialists, focused on delivering Single Security Data Content, Governance and Product Solutions and a Research Platform. The team leverages data, cloud, and emerging technologies in building an innovative data platform, with a focus on business and research use cases in the Single Security space. The team uses various statistical/mathematical methodologies to derive insights and generate content to help develop predictive models, clustering, and classification solutions and enable Governance. The team works on Mortgage, Structured & Credit Products. We are looking for a person to work with a specialized focus on Data & Model governance and expand to working on derived data and analytics content in the MBS, Structured Products and Credit space.
Experience Experience on Scala. Knowledge of ETL, data curation and analytical jobs using distributed computing frameworks with Spark. Knowledge and experience of working with large enterprise databases like Snowflake, Cassandra and cloud managed services like Dataproc, Databricks. Knowledge of financial instruments like Corporate Bonds, Derivatives etc. Knowledge of regression methodologies. Aptitude for designing and building tools for Data Governance. Python knowledge is a plus. Qualifications Bachelor's/Master's in Computer Science with a major in Math, Econ, or a related field. 7+ years of relevant experience. Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses.
Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Description Summary Designs, develops, tests, debugs and implements more complex operating systems components, software tools, and utilities with full competency. Coordinates with users to determine requirements. Reviews systems under development and related documentation. Makes more complex modifications to existing software to fit specialized needs and configurations, and maintains program libraries and technical documentation. May coordinate activities of the project team and assist in monitoring project schedules and costs. Essential Duties And Responsibilities Lead and Manage configuration, maintenance, and support of portfolio of AI models and related products. Manage model delivery to Production deployment team and coordinate model production deployments. Ability to analyze complex data requirements, understand exploratory data analysis, and design solutions that meet business needs. Work on analyzing data profiles, transformation, quality and security with the dev team to build and enhance data pipelines while maintaining proper quality and control around the data sets. Work closely with cross-functional teams, including business analysts, data engineers, and domain experts. Understand business requirements and translate them into technical solutions. Understand and review the business use cases for data pipelines for the Data Lake including ingestion, transformation and storing in the Lakehouse. Present architecture and solutions to executive-level. Minimum Qualifications Bachelor's or master's degree in computer science, Engineering, or related technical field Minimum of 5 years' experience in building data pipelines for both structured and unstructured data. At least 2 years' experience in Azure data pipeline development. Preferably 3 or more years' experience with Hadoop, Azure Databricks, Stream Analytics, Eventhub, Kafka, and Flink. 
Strong proficiency in Python and SQL. Experience with big data technologies (Spark, Hadoop, Kafka). Familiarity with ML frameworks (TensorFlow, PyTorch, scikit-learn). Knowledge of model serving technologies (TensorFlow Serving, MLflow, KubeFlow) will be a plus. Experience with one of the cloud platforms (Azure preferred) and their Data Services; understanding of ML services will get preference. Understanding of containerization and orchestration (Docker, Kubernetes). Experience with data versioning and ML experiment tracking will be a great addition. Knowledge of distributed computing principles. Familiarity with DevOps practices and CI/CD pipelines. Preferred Qualifications Bachelor's degree in Computer Science or equivalent work experience. Experience with Agile/Scrum methodology. Experience with tax and accounting domain a plus. Azure Data Scientist certification a plus. Applicants may be required to appear onsite at a Wolters Kluwer office as part of the recruitment process.
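The pipeline-building experience described above (ingesting both structured and unstructured data) follows an ingest-transform-store pattern; here is a toy sketch in plain Python with invented field names — a production pipeline would run on Spark or Azure Databricks with a real dead-letter path:

```python
import json

def ingest(raw_lines):
    """Parse semi-structured JSON lines, skipping records that fail to parse."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # in production: route bad records to a dead-letter queue

def transform(records):
    """Keep only well-formed records and normalise field names and types."""
    for rec in records:
        if "user" in rec and "amount" in rec:
            yield {"user_name": rec["user"].strip(), "amount": float(rec["amount"])}

raw = ['{"user": " alice ", "amount": "10.5"}', "not json", '{"user": "bob", "amount": 3}']
result = list(transform(ingest(raw)))
print(result)
```

Generators keep each stage streaming, so the same shape scales from a list of strings to a file or message-queue source without buffering the whole dataset.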

Posted 3 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Title: Lead Data Engineer – C12 / Assistant Vice President (India) The Role The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology (4-year graduate course) 8 to 12 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud-native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills An inclination to mentor; an ability to lead and deliver medium-sized components independently Technical Skills (Must Have) ETL: Hands-on experience of building data pipelines.
Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management: Expertise around Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Data Governance: A strong grasp of principles and practice including data quality, security, privacy and compliance Technical Skills (Valuable) Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls Containerization: Fair understanding of containerization platforms like Docker, Kubernetes File Formats: Exposure in working on Event/File/Table Formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others: Experience of using a Job scheduler e.g., Autosys. Exposure to Business Intelligence tools e.g., Tableau, Power BI Certification on any one or more of the above topics would be an advantage.
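The "Data Quality & Controls" skill above often takes the form of a reconciliation control: after a pipeline load, compare source and target record counts. A hedged sketch in plain Python follows (function and field names are invented; real controls would run inside the ETL platform, e.g. Ab Initio or Spark):

```python
def reconcile(source_rows, target_rows, tolerance=0):
    """Compare source and target record counts; return (passed, detail).

    tolerance allows a small permitted difference, e.g. for rejected records
    that were intentionally routed to an error table.
    """
    diff = abs(len(source_rows) - len(target_rows))
    passed = diff <= tolerance
    detail = f"source={len(source_rows)} target={len(target_rows)} diff={diff}"
    return passed, detail

# Illustrative run: three records extracted, three loaded.
ok, detail = reconcile([1, 2, 3], [1, 2, 3])
print(ok, detail)
```

A failing check would typically halt the downstream job and page the on-call engineer rather than silently publishing incomplete data.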
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 3 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About The Team Rubrik is on a mission to secure the world’s data and our Information Technology Team is committed to supporting this mission. As part of the newly founded IT AI team, you’ll be pivotal in driving AI-powered transformation, enabling smarter automation, data-driven insights, and scalable solutions that empower Rubrik’s mission. About The Role We are seeking an experienced GenAI Engineer to join our Data Engineering team, with a focus on building AI Agents and workflows. The successful candidate will be responsible for integrating data sources and building MCP clients/servers to support the development and deployment of LLM-based Agents and bots. This role involves close collaboration with business teams, data scientists, and fellow data engineers to ensure smooth data integration and flow, enabling the Data Engineering team to leverage GenAI tools for advanced data solutions. What You’ll Do Design and develop data integrations through MCP protocols or traditional data extraction mechanisms.
Leverage Snowflake Cortex, Gemini Agentspace or similar tools to build scalable and efficient data solutions for AI workloads, enabling the Data Engineering team to generate high-quality data products from unstructured and structured data. Ensure data quality, integrity, and scalability for large-scale AI workloads, supporting the development of GenAI models. Collaborate with business teams, data engineers and application developers to deliver products that streamline business processes or drive top-line growth and bottom-line improvements. Integrate data pipelines with existing infrastructure, enabling seamless data flow and analytics. Design and develop scalable data pipelines for GenAI model training and deployment, utilizing tools like Snowflake Cortex and Databricks LLM tooling (Mosaic AI, RAG, Model Serving).
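The agent workflow this role builds (an LLM deciding which tool to call, with the runtime dispatching that call against a registry of integrations) can be sketched as a plain dispatch loop. The model is stubbed out below and the MCP wire format is not reproduced; the tool names and data are invented, purely to illustrate the control flow:

```python
# Registry of tools the agent may invoke; names and signatures are hypothetical.
TOOLS = {
    "row_count": lambda table: {"orders": 128, "users": 42}.get(table, 0),
    "echo": lambda text: text,
}

def stub_model(prompt: str) -> dict:
    """Stand-in for an LLM: always returns one structured tool call.

    A real agent would send the prompt plus tool schemas to the model and
    parse its tool-call response (e.g. over MCP).
    """
    return {"tool": "row_count", "args": {"table": "orders"}}

def run_agent(prompt: str):
    """One agent step: ask the model for a tool call, validate it, dispatch it."""
    call = stub_model(prompt)
    tool = TOOLS.get(call["tool"])
    if tool is None:
        raise ValueError(f"unknown tool: {call['tool']}")
    return tool(**call["args"])

print(run_agent("How many orders do we have?"))
```

In a full agent the tool result would be fed back to the model in a loop until it produces a final answer instead of another tool call.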
Experience You’ll Need 1+ years of experience building AI Agents or leveraging Snowflake Cortex, Gemini Agentspace or similar open-source tooling 3+ years of experience in data engineering, with a focus on AI/ML workloads 5+ years of experience working in Data Analytics with either Snowflake or Databricks Strong programming skills in languages like Python, Java, or Scala Knowledge of data storage solutions (e.g., Snowflake, Databricks) and data APIs Experience with cloud configuration and data governance Strong problem-solving skills and ability to work in a fast-paced environment Experience with large language models (LLMs) like transformer-based models, and frameworks like LangChain or similar. Preferred Qualifications Building AI Agents and Agentic workflows Experience leveraging MCP, Agent2Agent Protocols Knowledge of generative models and their applications in data engineering Experience with data governance and security best practices for GenAI workloads Experience with Agile development methodologies and collaboration tools (e.g., Jira, GitHub) Join Us in Securing the World's Data Rubrik (NYSE: RBRK) is on a mission to secure the world’s data. With Zero Trust Data Security™, we help organizations achieve business resilience against cyberattacks, malicious insiders, and operational disruptions. Rubrik Security Cloud, powered by machine learning, secures data across enterprise, cloud, and SaaS applications. We help organizations uphold data integrity, deliver data availability that withstands adverse conditions, continuously monitor data risks and threats, and restore businesses with their data when infrastructure is attacked. Linkedin | X (formerly Twitter) | Instagram | Rubrik.com Inclusion @ Rubrik At Rubrik, we are dedicated to fostering a culture where people from all backgrounds are valued, feel they belong, and believe they can succeed. Our commitment to inclusion is at the heart of our mission to secure the world’s data.
Our goal is to hire and promote the best talent, regardless of background. We continually review our hiring practices to ensure fairness and strive to create an environment where every employee has equal access to opportunities for growth and excellence. We believe in empowering everyone to bring their authentic selves to work and achieve their fullest potential. Our inclusion strategy focuses on three core areas of our business and culture: Our Company: We are committed to building a merit-based organization that offers equal access to growth and success for all employees globally. Your potential is limitless here. Our Culture: We strive to create an inclusive atmosphere where individuals from all backgrounds feel a strong sense of belonging, can thrive, and do their best work. Your contributions help us innovate and break boundaries. Our Communities: We are dedicated to expanding our engagement with the communities we operate in, creating opportunities for underrepresented talent and driving greater innovation for our clients. Your impact extends beyond Rubrik, contributing to safer and stronger communities. Equal Opportunity Employer/Veterans/Disabled Rubrik is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability. Rubrik provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Rubrik complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. 
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. Federal law requires employers to provide reasonable accommodation to qualified individuals with disabilities. Please contact us at hr@rubrik.com if you require a reasonable accommodation to apply for a job or to perform your job. Examples of reasonable accommodation include making a change to the application process or work procedures, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. EEO IS THE LAW NOTIFICATION OF EMPLOYEE RIGHTS UNDER FEDERAL LABOR LAWS
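For candidates wondering what "building AI Agents" means in practice: underneath most agent frameworks sits a simple tool-calling loop. A framework-free Python sketch of that pattern (every name here is illustrative, including the stubbed model policy standing in for a real LLM; this is not Snowflake Cortex's or LangChain's actual API):

```python
# Minimal agent loop: a model policy picks a tool, the loop executes it and
# feeds the observation back until the policy produces a final answer.

def calculator(expression: str) -> str:
    """A 'tool' the agent can call (eval restricted to plain arithmetic)."""
    return str(eval(expression, {"__builtins__": {}}, {}))

TOOLS = {"calculator": calculator}

def run_agent(llm, question: str, max_steps: int = 5) -> str:
    history = [("user", question)]
    for _ in range(max_steps):
        # llm returns ("tool", name, arg) or ("final", text)
        action = llm(history)
        if action[0] == "final":
            return action[1]
        _, name, arg = action
        observation = TOOLS[name](arg)
        history.append(("observation", observation))
    raise RuntimeError("agent did not converge")

# Stub policy for demonstration: call the calculator once, then answer.
def stub_llm(history):
    if history[-1][0] == "user":
        return ("tool", "calculator", "6 * 7")
    return ("final", f"The answer is {history[-1][1]}")

print(run_agent(stub_llm, "What is 6 times 7?"))  # prints: The answer is 42
```

Real frameworks add prompt formatting, retries, and tool schemas around this same loop, which is why the listed protocols (MCP, Agent2Agent) are essentially standardized ways of exposing the `TOOLS` side.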

Posted 3 days ago

Apply

5.0 years

0 Lacs

India

On-site


About Oportun Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009. WORKING AT OPORTUN Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups. Position Overview As a Sr. Data Engineer at Oportun, you will be a key member of our team, responsible for designing, developing, and maintaining sophisticated software / data platforms in achieving the charter of the engineering group. Your mastery of a technical domain enables you to take up business problems and solve them with a technical solution. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior engineers, and collaborate closely with cross-functional teams to deliver high-quality, scalable software solutions that advance our impact in the market. 
This is a role where you will have the opportunity to take up responsibility in leading the technology effort - from technical requirements gathering to final successful delivery of the product - for large initiatives (cross-functional and multi-month-long projects).
Responsibilities
Data Architecture and Design: Lead the design and implementation of scalable, efficient, and robust data architectures to meet business needs and analytical requirements. Collaborate with stakeholders to understand data requirements, build subject matter expertise, and define optimal data models and structures.
Data Pipeline Development and Optimization: Design and develop data pipelines, ETL processes, and data integration solutions for ingesting, processing, and transforming large volumes of structured and unstructured data. Optimize data pipelines for performance, reliability, and scalability.
Database Management and Optimization: Oversee the management and maintenance of databases, data warehouses, and data lakes to ensure high performance, data integrity, and security. Implement and manage ETL processes for efficient data loading and retrieval.
Data Quality and Governance: Establish and enforce data quality standards, validation rules, and data governance practices to ensure data accuracy, consistency, and compliance with regulations. Drive initiatives to improve data quality and documentation of data assets.
Mentorship and Leadership: Provide technical leadership and mentorship to junior team members, assisting in their skill development and growth. Lead and participate in code reviews, ensuring best practices and high-quality code.
Collaboration and Stakeholder Management: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data needs and deliver solutions that meet those needs. Communicate effectively with non-technical stakeholders to translate technical concepts into actionable insights and business value. 
Performance Monitoring and Optimization: Implement monitoring systems and practices to track data pipeline performance, identify bottlenecks, and optimize for improved efficiency and scalability.
Common Requirements
You have a strong understanding of a business or system domain, with sufficient knowledge and expertise around the appropriate metrics and trends.
You collaborate closely with product managers, designers, and fellow engineers to understand business needs and translate them into effective solutions.
You provide technical leadership and expertise, guiding the team in making sound architectural decisions and solving challenging technical problems. Your solutions anticipate scale, reliability, monitoring, integration, and extensibility.
You conduct code reviews and provide constructive feedback to ensure code quality, performance, and maintainability.
You mentor and coach junior engineers, fostering a culture of continuous learning, growth, and technical excellence within the team.
You play a significant role in the ongoing evolution and refinement of current tools and applications used by the team, and drive adoption of new practices within your team.
You take ownership of (customer) issues, including initial troubleshooting, identification of root cause, and issue escalation or resolution, while maintaining the overall reliability and performance of our systems.
You set the benchmark for responsiveness, ownership, and overall accountability of engineering systems.
You independently drive and lead multiple features, contribute to one or more large projects, and lead smaller projects. You can orchestrate work that spans multiple engineers within your team and keep all relevant stakeholders informed.
You keep your lead/EM informed about your work and that of the team so they can share it with stakeholders, including escalation of issues.
Qualifications
Bachelor's or Master's degree in Computer Science, Data Science, or a related field. 
5+ years of experience in data engineering, with a focus on data architecture, ETL, and database management.
Proficiency in programming languages such as Python/PySpark, and Java or Scala.
Expertise in big data technologies such as Hadoop, Spark, and Kafka.
In-depth knowledge of SQL and experience with various database technologies (e.g., PostgreSQL, MariaDB, NoSQL databases).
Experience and expertise in building complex end-to-end data pipelines.
Experience with orchestration and designing job schedules using CI/CD and workflow tools such as Jenkins, Airflow, or Databricks.
Ability to work in an Agile environment (Scrum, Lean, Kanban, etc.).
Ability to mentor junior team members.
Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., AWS Redshift, S3, Azure SQL Data Warehouse).
Strong leadership, problem-solving, and decision-making skills.
Excellent communication and collaboration abilities.
Familiarity or certification in Databricks is a plus.
We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate. California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/.
We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI’s Internet Crime Complaint Center (IC3).
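The end-to-end pipeline work this role describes reduces to extract-transform-load stages with quality rules in between. A stdlib-only sketch of that shape (SQLite stands in for a warehouse such as Redshift or Snowflake; the table, columns, and validity rule are hypothetical):

```python
import csv
import io
import sqlite3

# Extract: read raw records from a CSV source (an in-memory file here;
# in practice this would be S3, Kafka, or a source database).
RAW = "member_id,amount\n1,100.50\n2,-5.00\n1,25.00\n"

def extract(source: str):
    return list(csv.DictReader(io.StringIO(source)))

# Transform: cast types and drop records failing a validity rule.
def transform(rows):
    out = []
    for r in rows:
        amount = float(r["amount"])
        if amount >= 0:  # data-quality rule: no negative payments
            out.append((int(r["member_id"]), amount))
    return out

# Load: write into a warehouse table (SQLite standing in here).
def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS payments (member_id INT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 125.5 (the -5.00 record was rejected)
```

At production scale the same three stages run on Spark/PySpark with an orchestrator (e.g., Airflow) sequencing them, but the decomposition is identical.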

Posted 3 days ago

Apply

25.0 years

4 - 7 Lacs

Cochin

On-site

Company Overview Milestone Technologies is a global IT managed services firm that partners with organizations to scale their technology, infrastructure and services to drive specific business outcomes such as digital transformation, innovation, and operational agility. Milestone is focused on building an employee-first, performance-based culture and for over 25 years, we have a demonstrated history of supporting category-defining enterprise clients that are growing ahead of the market. The company specializes in providing solutions across Application Services and Consulting, Digital Product Engineering, Digital Workplace Services, Private Cloud Services, AI/Automation, and ServiceNow. Milestone culture is built to provide a collaborative, inclusive environment that supports employees and empowers them to reach their full potential. Our seasoned professionals deliver services based on Milestone’s best practices and service delivery framework. By leveraging our vast knowledge base to execute initiatives, we deliver both short-term and long-term value to our clients and apply continuous service improvement to deliver transformational benefits to IT. With Intelligent Automation, Milestone helps businesses further accelerate their IT transformation. The result is a sharper focus on business objectives and a dramatic improvement in employee productivity. Through our key technology partnerships and our people-first approach, Milestone continues to deliver industry-leading innovation to our clients. With more than 3,000 employees serving over 200 companies worldwide, we are following our mission of revolutionizing the way IT is deployed. Job Overview In this vital role you will be responsible for the development and implementation of our data strategy. The ideal candidate possesses a strong blend of technical expertise and data-driven problem-solving skills. 
As a Data Engineer, you will play a crucial role in building and optimizing our data pipelines and platforms in a SAFe Agile product team.
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
Deliver data pipeline projects from development to deployment, managing timelines and risks.
Ensure data quality and integrity through meticulous testing and monitoring.
Leverage cloud platforms (AWS, Databricks) to build scalable and efficient data solutions.
Work closely with the product team and key collaborators to understand data requirements.
Adhere to data engineering industry standards and best practices.
Experience developing in an Agile development environment, and comfort with Agile terminology and ceremonies.
Familiarity with code versioning using Git and with code migration tools. Familiarity with JIRA.
Stay up to date with the latest data technologies and trends.
What we expect of you
Basic Qualifications:
Doctorate degree OR Master’s degree and 4 to 6 years of Information Systems experience OR Bachelor’s degree and 6 to 8 years of Information Systems experience OR Diploma and 10 to 12 years of Information Systems experience.
Demonstrated hands-on experience with cloud platforms (AWS, Azure, GCP).
Proficiency in Python, PySpark, and SQL.
Development knowledge in Databricks.
Good analytical and problem-solving skills to address sophisticated data challenges.
Preferred Qualifications:
Experienced with data modeling.
Experienced working with ETL orchestration technologies.
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Familiarity with SQL/NoSQL databases.
Soft Skills:
Skilled in breaking down problems, documenting problem statements, and estimating efforts.
Effective communication and interpersonal skills to collaborate with multi-functional teams. 
Excellent analytical and problem solving skills. Strong verbal and written communication skills Ability to work successfully with global teams High degree of initiative and self-motivation. Team-oriented, with a focus on achieving team goals Compensation Estimated Pay Range: Exact compensation and offers of employment are dependent on circumstances of each case and will be determined based on job-related knowledge, skills, experience, licenses or certifications, and location. Our Commitment to Diversity & Inclusion At Milestone we strive to create a workplace that reflects the communities we serve and work with, where we all feel empowered to bring our full, authentic selves to work. We know creating a diverse and inclusive culture that champions equity and belonging is not only the right thing to do for our employees but is also critical to our continued success. Milestone Technologies provides equal employment opportunity for all applicants and employees. All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, gender, gender identity, marital status, age, disability, veteran status, sexual orientation, national origin, or any other category protected by applicable federal and state law, or local ordinance. Milestone also makes reasonable accommodations for disabled applicants and employees. We welcome the unique background, culture, experiences, knowledge, innovation, self-expression and perspectives you can bring to our global community. Our recruitment team is looking forward to meeting you.
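The "data quality and integrity through meticulous testing" responsibility above is usually expressed as declarative checks run against each batch before it is published. A minimal sketch of that idea (check names, columns, and thresholds are illustrative, not any particular tool's API):

```python
# Simple declarative data-quality checks of the kind a pipeline runs
# before publishing a batch downstream.

def check_not_null(rows, column):
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_range(rows, column, lo, hi):
    return all(lo <= r[column] <= hi for r in rows)

def run_checks(rows, checks):
    """Return a {check_name: passed} report; a pipeline would fail the
    batch (or quarantine the offending rows) on any False result."""
    return {name: fn(rows) for name, fn in checks.items()}

batch = [
    {"id": 1, "age": 34},
    {"id": 2, "age": 29},
    {"id": 3, "age": 151},  # out of range: should be caught
]
report = run_checks(batch, {
    "id_not_null": lambda r: check_not_null(r, "id"),
    "id_unique": lambda r: check_unique(r, "id"),
    "age_in_range": lambda r: check_range(r, "age", 0, 120),
})
print(report)  # {'id_not_null': True, 'id_unique': True, 'age_in_range': False}
```

Frameworks used on Databricks (expectations in Delta Live Tables, Great Expectations, and similar) package this same pattern with metrics and alerting on top.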

Posted 3 days ago

Apply

130.0 years

6 - 9 Lacs

Hyderābād

On-site

Job Description
Manager, Senior Data Engineer
The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.
Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.
Role Overview:
Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation. 
Responsibilities
Designs, builds, and maintains data pipeline architecture: ingests, processes, and publishes data for consumption.
Batch-processes collected data and formats it in an optimized way to make it analysis-ready.
Ensures best-practice sharing across the organization.
Enables delivery of data-analytics projects.
Develops deep knowledge of the company's supported technology; understands the full complexity of, and dependencies between, multiple teams and platforms (people, technologies).
Communicates intensively with other platforms/competencies to comprehend new trends and methodologies being implemented or considered within the company ecosystem.
Understands customers' and stakeholders' business needs and priorities and helps build solutions that support our business goals.
Establishes and manages close relationships with customers/stakeholders.
Maintains an overview of data engineering market developments in order to explore new ways of delivering pipelines that increase their value and contribution.
Builds a “community of practice”, leveraging experience from delivering complex analytics projects.
Is accountable for ensuring that the team delivers solutions with high quality standards, timeliness, compliance, and an excellent user experience.
Contributes to innovative experiments, specifically idea generation, idea incubation, and/or experimentation, identifying tangible and measurable criteria.
Qualifications:
Bachelor’s degree in Computer Science, Data Science, Information Technology, Engineering, or a related field.
3+ years of experience as a Data Engineer or in a similar role, with a strong portfolio of data projects.
3+ years of SQL experience, with the ability to write and optimize queries for large datasets.
1+ years of experience and proficiency in Python for data manipulation, automation, and pipeline development.
Experience with Databricks, including creating notebooks and utilizing Spark for big data processing. 
Strong experience with data warehousing solutions (such as Snowflake), including schema design and performance optimization.
Experience with data governance and quality management tools, particularly Collibra DQ.
Strong analytical and problem-solving skills, with attention to detail.
Who we are:
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.
What we look for:
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today.
#HYDIT2025
Current Employees apply HERE
Current Contingent Workers apply HERE
Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. 
Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.
Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business, Business Intelligence (BI), Business Management, Contractor Management, Cost Reduction, Database Administration, Database Optimization, Data Engineering, Data Flows, Data Infrastructure, Data Management, Data Modeling, Data Optimization, Data Quality, Data Visualization, Design Applications, Information Management, Management Process, Operating Cost Reduction, Productivity Improvements, Project Engineering, Social Collaboration, Software Development, Software Development Life Cycle (SDLC) {+ 1 more}
Preferred Skills:
Job Posting End Date: 08/20/2025
A job posting is effective until 11:59:59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R350684
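"Write and optimize queries for large datasets", as this posting asks, often comes down to patterns like latest-record-per-key over append-only ingests. A sketch using SQLite's window functions (SQLite stands in for Databricks or Snowflake SQL; the schema is hypothetical):

```python
import sqlite3

# "Latest record per key": a staple pattern when publishing an
# analysis-ready table from an append-only ingest.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE readings (device_id INT, ts INT, value REAL);
    INSERT INTO readings VALUES
        (1, 100, 20.0), (1, 200, 21.5),
        (2, 100, 30.0), (2, 150, 29.0);
""")

# Rank each device's readings by recency, then keep only rank 1.
latest = conn.execute("""
    SELECT device_id, value
    FROM (
        SELECT device_id, value,
               ROW_NUMBER() OVER (PARTITION BY device_id ORDER BY ts DESC) AS rn
        FROM readings
    )
    WHERE rn = 1
    ORDER BY device_id
""").fetchall()
print(latest)  # [(1, 21.5), (2, 29.0)]
```

On a real warehouse the same query is typically paired with partitioning or clustering on `device_id`/`ts` so the window scan stays cheap as the table grows.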

Posted 3 days ago

Apply

130.0 years

6 - 9 Lacs

Hyderābād

On-site

Job Description
Manager, Senior Data Engineer
The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.
Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.
Role Overview:
Our IT team operates as a business partner, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver the services and solutions that help everyone to be more productive and enable innovation. 
The candidate will work with a globally diverse set of teams, which includes SAP Basis, Security, ABAP, SAP functional team members, the Infrastructure team, and other IT process partners, providing support for existing and new initiatives. The candidate will work closely with and advise the SAP Technical Architect on architectural topics and new applications/technologies to be integrated. The candidate will lead some cross-functional projects, be relied upon to answer complex questions, and assist with program-wide initiatives. Our organization is on a transformation journey, and we envision using newer SAP technologies and infrastructure as part of this transformation; the candidate must have exposure to these new technologies. It is expected that the candidate will be able both to lead technical initiatives and to be hands-on.
Responsibilities
Designs, builds, and maintains data pipeline architecture: ingests, processes, and publishes data for consumption.
Batch-processes collected data and formats it in an optimized way to make it analysis-ready.
Ensures best-practice sharing across the organization.
Enables delivery of data-analytics projects.
Develops deep knowledge of the company's supported technology; understands the full complexity of, and dependencies between, multiple teams and platforms (people, technologies).
Communicates intensively with other platforms/competencies to comprehend new trends and methodologies being implemented or considered within the company ecosystem.
Understands customers' and stakeholders' business needs and priorities and helps build solutions that support our business goals.
Establishes and manages close relationships with customers/stakeholders.
Maintains an overview of data engineering market developments in order to explore new ways of delivering pipelines that increase their value and contribution.
Builds a “community of practice”, leveraging experience from delivering complex analytics projects.
Is accountable for ensuring that the team delivers solutions with high quality standards, timeliness, compliance, and an excellent user experience.
Contributes to innovative experiments, specifically idea generation, idea incubation, and/or experimentation, identifying tangible and measurable criteria.
Qualifications:
Bachelor’s degree in Computer Science, Data Science, Information Technology, Engineering, or a related field.
5+ years of experience as a Data Engineer or in a similar role, with a strong portfolio of data projects.
3+ years of SQL experience, with the ability to write and optimize queries for large datasets.
1+ years of experience and proficiency in Python for data manipulation, automation, and pipeline development.
Experience with Databricks, including creating notebooks and utilizing Spark for big data processing.
Strong experience with data warehousing solutions (such as Snowflake), including schema design and performance optimization.
Experience with data governance and quality management tools, particularly Collibra DQ.
Strong analytical and problem-solving skills, with attention to detail.
Who we are:
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.
What we look for:
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. 
Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular Relocation: VISA Sponsorship: Travel Requirements: Flexible Work Arrangements: Hybrid Shift: Valid Driving License: Hazardous Material(s): Required Skills: Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs Preferred Skills: Job Posting End Date: 07/22/2025 A job posting is effective until 11:59:59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R350689

Posted 3 days ago

Apply

3.0 - 6.0 years

6 - 9 Lacs

Hyderābād

On-site

Senior Analyst – Data Engineer - Deloitte Technology - Deloitte Support Services India Private Limited Do you thrive on developing creative and innovative insights to solve complex challenges? Want to work on next-generation, cutting-edge products and services that deliver outstanding value and that are global in vision and scope? Work with premier thought leaders in your field? Work for a world-class organization that provides an exceptional career experience with an inclusive and collaborative culture? Work you’ll do Seeking a candidate with extensive experience on designing, delivering and maintaining implementations of solutions in the cloud, specifically Microsoft Azure. This candidate should also possess strong cross-discipline communication skills, strong analytical aptitude with critical thinking, a solid understanding of how data would translate into reporting / dashboarding capabilities, and the tools and platforms that support them. Responsibilities Role Specific Designing a well-structured data model using methodologies (e.g., Kimball or Inmon) that accurately represents the business requirements, ensures data integrity and minimizes redundancies. Developing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into Azure data services. This includes using Azure Data Factory, Azure Databricks, or other tools to orchestrate data workflows and data movement. Build, Test and Run of data assets tied to tasks and user stories from the Azure DevOps instance of Enterprise Data & Analytics. 
Bring a level of technical expertise in the Big Data space that contributes to the strategic roadmaps for Enterprise Data Architecture, Global Data Cloud Architecture, and Global Business Intelligence Architecture, as well as to the development of the broader Enterprise Data & Analytics Engineering community.
Actively participate in regularly scheduled contact calls to transparently review the status of in-flight projects, the priorities of backlog projects, and the adoption of previous deliveries from Enterprise Data & Analytics with the Data Insights team.
Handle break-fixes and participate in a rotational on-call schedule. On-call includes monitoring of scheduled jobs and ETL pipelines.
Actively participate in team meetings to transparently review the status of in-flight projects and their progress.
Follow standard practice and frameworks on each project from development, to testing, and then productionizing, each within the appropriate environment laid out by Data Architecture.
Challenge self and others to make an impact that matters, and help the team connect their contributions with the broader purpose.
Set expectations for the team, align the work based on strengths and competencies, and challenge them to raise the bar while providing support.
Bring extensive knowledge of multiple technologies, tools, and processes to improve the design and architecture of the assigned applications.
Knowledge Sharing / Documentation
Contribute to, produce, and maintain processes, procedures, and operational and architectural documentation.
Change Control: ensure compliance with processes and adherence to standards and documentation.
Work with Deloitte Technology leadership and service teams in reviewing documentation and aligning KPIs to critical steps in our service operations.
Active participation in ongoing training within the BI space.
The team
At Deloitte, we’re all about collaboration. And nowhere is this more apparent than among our 2,000-strong internal services team. 
With our combined specialist skills, we provide all the essential support and advice our client-facing colleagues need, right across the firm. This enables them to focus all of their efforts on delivering the best service possible to their clients. Covering seven distinct areas; Human Resources, Clients & Industries, Finance & Legal, Practice Support Services, Quality & Risk Services, IT Services, and Workplace Services & Real Estate, together we live, breathe and deliver the Deloitte experience. Location: Hyderabad Work shift Timings: 11 AM to 8 PM Qualifications Bachelor of Engineering/ Bachelor of Technology 3-6 years of broad-based IT experience with technical knowledge of Microsoft SQL Server, Azure SQL Data Warehouse, Azure Data Lake Store, Azure Data Factory Demonstrated experience in Apache Framework (Spark, Scala, etc.) Well versed in SQL and comfortable in scripting using Python or similar language. First Month Critical Outcomes: Absorb strategic projects from the backlog and complete the related Azure SQL Data Warehouse Development work. Inspect existing run-state SQL Server databases and Azure SQL Data Warehouses and identify optimizations for potential development. Deliver new databases assigned as needed. Integration to on-call rotation (First 90 days). Contribute to legacy content and architecture migration to data lake (First 90 days). Delivery of first 2 data ingestion pipelines to include ingestion, QA and automation using Azure Big Data tools (First 90 days). Ability to document all work following standard documentation practices set forth by Data Governance (First 90 days). How you’ll grow At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. 
And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities, including exposure to leaders, sponsors, coaches, and challenging assignments, to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad offices, is an extension of Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte’s culture
Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.
#EAG-Technology

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 304653

Posted 3 days ago

Apply

7.0 years

0 Lacs

Gurgaon

On-site

Job Purpose
Lead client calls and guide clients toward optimized, cloud-native architectures, the future state of their data platform, strategic recommendations, and Microsoft Fabric integration.

Desired Skills and Experience
  • B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
  • 7+ years of experience in data and cloud architecture, working with client stakeholders
  • Azure data platform expertise: Synapse, Databricks, Azure Data Factory (ADF), Azure SQL (DW/DB), Power BI (PBI)
  • Ability to define modernization roadmaps and target architecture
  • Strong understanding of data governance best practices for data quality, cataloguing, and lineage
  • Proven ability to lead client engagements and present complex findings
  • Excellent communication skills, both written and verbal
  • Extremely strong organizational and analytical skills with strong attention to detail
  • Strong track record of excellent results delivered to internal and external clients
  • Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts
  • Experience delivering projects within an agile environment
  • Experience in project management and team management

Key responsibilities include:
  • Lead all interviews and workshops to capture current and future needs.
  • Direct the technical review of Azure infrastructure (Databricks, Synapse Analytics, Power BI) and critical on-premises systems.
  • Produce architecture designs, focusing on refined processing strategies and Microsoft Fabric.
  • Understand and refine the data governance roadmap, including data cataloguing, lineage, and quality.
  • Lead project deliverables, ensuring actionable and strategic outputs.
  • Evaluate and ensure the quality of deliverables within project timelines.
  • Develop a strong understanding of the equity market domain.
  • Collaborate with domain experts and business stakeholders to understand business rules and logic.
  • Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders.
  • Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments.
  • Take responsibility for end-to-end delivery of projects, coordinate between the client and internal offshore teams, and manage client queries.
  • Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic.
  • Perform quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT).

Posted 3 days ago

Apply

Exploring Databricks Jobs in India

Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Databricks professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

In the field of Databricks, a typical career path may include:

  1. Junior Developer
  2. Senior Developer
  3. Tech Lead
  4. Architect

Related Skills

In addition to Databricks expertise, other skills that are often expected or helpful include:

  • Apache Spark
  • Python/Scala programming
  • Data modeling
  • SQL
  • Data visualization tools
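The SQL skill listed above is easy to practice before an interview without any cloud account: Python's standard-library sqlite3 module gives you a full SQL engine in memory. The sketch below is purely illustrative (the table and column names are invented for this example, not taken from any real dataset):

```python
import sqlite3

# In-memory database; no external services or files needed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (city TEXT, openings INTEGER)")
conn.executemany(
    "INSERT INTO jobs VALUES (?, ?)",
    [("Bangalore", 120), ("Hyderabad", 95), ("Pune", 60)],
)

# Aggregate query: total openings across all cities.
(total,) = conn.execute("SELECT SUM(openings) FROM jobs").fetchone()
print(total)  # 275
conn.close()
```

The same SELECT/GROUP BY/JOIN patterns carry over directly to Spark SQL notebooks in Databricks, so drilling them locally is time well spent.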

Interview Questions

  • What is Databricks and how is it different from Apache Spark? (basic)
  • Explain the concept of lazy evaluation in Databricks. (medium)
  • How do you optimize performance in Databricks? (advanced)
  • What are the different cluster modes in Databricks? (basic)
  • How do you handle data skewness in Databricks? (medium)
  • Explain how you can schedule jobs in Databricks. (medium)
  • What is the significance of Delta Lake in Databricks? (advanced)
  • How do you handle schema evolution in Databricks? (medium)
  • What are the different file formats supported by Databricks for reading and writing data? (basic)
  • Explain the concept of checkpointing in Databricks. (medium)
  • How do you troubleshoot performance issues in Databricks? (advanced)
  • What are the key components of Databricks Runtime? (basic)
  • How can you secure your data in Databricks? (medium)
  • Explain the role of MLflow in Databricks. (advanced)
  • How do you handle streaming data in Databricks? (medium)
  • What is the difference between Databricks Community Edition and Databricks Workspace? (basic)
  • How do you set up monitoring and alerting in Databricks? (medium)
  • Explain the concept of Delta caching in Databricks. (advanced)
  • How do you handle schema enforcement in Databricks? (medium)
  • What are the common challenges faced in Databricks projects and how do you overcome them? (advanced)
  • How do you perform ETL operations in Databricks? (medium)
  • Explain the concept of MLflow Tracking in Databricks. (advanced)
  • How do you handle data lineage in Databricks? (medium)
  • What are the best practices for data governance in Databricks? (advanced)
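To make one of the questions above concrete: Spark (and therefore Databricks) uses lazy evaluation, meaning transformations only build up an execution plan, and nothing runs until an action such as collect() is called. Python generators behave analogously, which makes them a handy whiteboard analogy in interviews. Note this sketch is a plain-Python analogy, not actual Spark code:

```python
def doubled(numbers):
    # Like a Spark transformation: calling this function describes
    # the work to be done, but does not perform it yet.
    for n in numbers:
        yield n * 2

pipeline = doubled(range(5))  # nothing has been computed at this point

# Like a Spark action (e.g. collect()): iterating the generator
# finally triggers the computation.
result = list(pipeline)
print(result)  # [0, 2, 4, 6, 8]
```

In a real answer you would add that laziness lets Spark's optimizer see the whole plan before execution, enabling optimizations such as predicate pushdown and stage fusion.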

Closing Remark

As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
