8.0 - 11.0 years
20 - 25 Lacs
Mumbai
Work from Office
About This Role

Aladdin Data: BlackRock is one of the world's leading asset management firms, and Aladdin is the firm's end-to-end operating system for investment professionals to see their whole portfolio, understand risk exposure, and act with precision. Aladdin is our operating platform to manage financial portfolios. It unites the client data, operators, and technology needed to manage transactions in real time through every step of the investment process. Aladdin Data is at the core of the Aladdin platform, and increasingly, our ability to consume, store, analyze, and gain insight from data is a key component of our competitive advantage. Our mission is to deliver critical insights to our stakeholders, enabling them to make data-driven decisions.

BlackRock's Data Operations team is at the heart of our data ecosystem, ensuring seamless data pipeline operations across the firm. Within this team, the Process Engineering group focuses on building tools to enhance observability, improve operator experience, streamline operations, and provide analytics that drive continuous improvement across the organization.

Key Responsibilities

Strategic Leadership: Drive the roadmap for process engineering initiatives that align with broader Data Operations and enterprise objectives. Partner on efforts to modernize legacy workflows and build scalable, reusable solutions that support operational efficiency, risk reduction, and enhanced observability. Define and track success metrics for operational performance and process health across critical data pipelines.

Process Engineering & Solutioning: Design and develop tools and products to support operational efficiency, observability, risk management, and KPI tracking. Define success criteria for data operations in collaboration with stakeholders across teams. Break down complex data challenges into scalable, manageable solutions aligned with business needs. Proactively identify operational inefficiencies and deliver data-driven improvements.

Data Insights & Visualization: Design data science solutions to analyze vendor data trends, identify anomalies, and surface actionable insights for business users and data stewards. Develop and maintain dashboards (e.g., Power BI, Tableau) that provide real-time visibility into vendor data quality, usage patterns, and operational health. Create metrics and KPIs that measure vendor data performance, relevance, and alignment with business needs.

Quality Control & Data Governance: Build automated QC frameworks and anomaly detection models to validate data integrity across ingestion points (see the sketch after this posting). Work with data engineering and governance teams to embed robust validation rules and control checks into pipelines. Reduce manual oversight by building scalable, intelligent solutions that detect, report, and in some cases self-heal data issues.

Testing & Quality Assurance: Collaborate with data engineering and stewardship teams to validate data integrity throughout ETL processes. Lead the automation of testing frameworks for deploying new datasets or new pipelines.

Collaboration & Delivery: Work closely with internal and external stakeholders to align technical solutions with business objectives. Communicate effectively with both technical and non-technical teams. Operate in an agile environment, managing multiple priorities and ensuring timely delivery of high-quality data solutions.

Experience & Education: 8+ years of experience in data engineering, data operations, analytics, or related fields, with at least 3 years in a leadership or senior IC capacity. Bachelor's or Master's degree in a quantitative field (Computer Science, Data Science, Statistics, Engineering, or Finance). Experience working with financial market data providers (e.g., Bloomberg, Refinitiv, MSCI) is highly valued. Proven track record of building and deploying ML models.

Technical Expertise: Deep proficiency in SQL and Python, with hands-on experience in data visualization (Power BI, Tableau), cloud data platforms (e.g., Snowflake), and Unix-based systems. Exposure to modern frontend frameworks (React JS) and microservices-based architectures is a strong plus. Familiarity with various database systems (Relational, NoSQL, Graph) and scalable data processing techniques.

Leadership & Communication Skills: Proven ability to lead cross-functional teams and influence without authority in a global, matrixed organization. Exceptional communication skills, with a track record of presenting complex technical topics to senior stakeholders and non-technical audiences. Strong organizational and prioritization skills, with a results-oriented mindset and experience in agile project delivery.

Preferred Qualifications: Certification in Snowflake or equivalent cloud data platforms. Certification in Power BI or other analytics tools. Experience leading Agile teams and driving enterprise-level transformation initiatives.

Our Benefits: To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model: BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock: At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment: the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit blackrock.com | Twitter: @blackrock | LinkedIn: linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
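For illustration only: the automated QC and anomaly-detection work this posting describes often begins with flagging feed volumes that deviate sharply from recent history. A minimal sketch, assuming a hypothetical vendor feed; the data, thresholds, and column names are invented, and a production system would use more robust statistics (e.g., median/MAD) over longer windows.

```python
import pandas as pd

def flag_anomalies(df: pd.DataFrame, value_col: str, z_thresh: float = 2.0) -> pd.DataFrame:
    """Flag rows whose value deviates more than z_thresh std devs from the mean."""
    mean, std = df[value_col].mean(), df[value_col].std(ddof=0)
    out = df.copy()
    out["z_score"] = (out[value_col] - mean) / std
    out["is_anomaly"] = out["z_score"].abs() > z_thresh
    return out

# Illustrative daily row counts from a hypothetical vendor feed; day 5 is short.
feed = pd.DataFrame({
    "load_date": pd.date_range("2024-01-01", periods=7),
    "row_count": [10_120, 10_340, 10_095, 10_210, 2_450, 10_180, 10_260],
})
print(flag_anomalies(feed, "row_count")[["load_date", "row_count", "is_anomaly"]])
```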
Posted 3 weeks ago
4.0 - 9.0 years
20 - 30 Lacs
Hyderabad, Pune, Delhi / NCR
Hybrid
Develop applications for investment management and trading/valuation support using Python, React, SQL, FastAPI, AWS, and LLMs. Unit testing; daily scrums with the client. Required candidate profile: 5+ years of Python with Flask, FastAPI, etc. Experience with REST API development. Hands-on UI development in React. Experience with cloud (Azure/AWS). Experience with LLMs/LangChain and DevOps.
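As a rough sketch of the valuation-support style of FastAPI endpoint this posting describes, a toy bond present-value service; the route, model, and fields are invented for illustration. Run locally with `uvicorn main:app --reload`, assuming the file is saved as main.py.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Valuation Support API")

class BondInput(BaseModel):
    face_value: float
    coupon_rate: float   # annual, e.g. 0.05 for 5%
    years: int
    discount_rate: float

@app.post("/valuation/bond")
def present_value(bond: BondInput) -> dict:
    """Discount the coupon stream plus principal back to today."""
    if bond.discount_rate <= -1:
        raise HTTPException(status_code=422, detail="discount_rate must exceed -100%")
    coupon = bond.face_value * bond.coupon_rate
    pv = sum(coupon / (1 + bond.discount_rate) ** t for t in range(1, bond.years + 1))
    pv += bond.face_value / (1 + bond.discount_rate) ** bond.years
    return {"present_value": round(pv, 2)}
```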
Posted 3 weeks ago
2.0 - 5.0 years
10 - 14 Lacs
Bengaluru
Work from Office
At ABB, we are dedicated to addressing global challenges. Our core values of care, courage, curiosity, and collaboration, combined with a focus on diversity, inclusion, and equal opportunities, are key drivers in our aim to empower everyone to create sustainable solutions. Write the next chapter of your ABB story.

This position reports to: HR Analytics Team Lead.

Your role and responsibilities: The People Analytics organization is a global function that works with various ABB business divisions and countries, delivering operational and expert services to ABB's HR community. We aim to unlock the potential of data to help ABB business leaders and managers take better decisions, enabling us to build a more sustainable and resource-efficient future in electrification and automation. In this role, you will partner with senior stakeholders to help them conceptualize KPIs in the areas of talent, performance, and culture, and build statistical and analytics solutions from scratch, with a bias towards creating clean, effective, user-focused visualizations that deliver actionable insights and analysis, using technologies that vary by purpose: Python, Snowflake, Power BI, Advanced Excel, VBA, or other new technologies. The work model for the role is: Hybrid. This role contributes to the People Analytics function supporting various business functions, based out of Bangalore.

You will be mainly accountable for: Capably interacting with and managing global ABB leadership to seek and provide meaningful and actionable insights in all interactions. On-time delivery of actionable insights, from requirement gathering and data extraction through to reporting and presenting findings, in an IC role or with the team as project needs dictate. Constantly looking out for ways to enhance value for your respective stakeholders/clients. Developing frameworks and plug-and-play solutions using diagnostic, predictive, and machine learning techniques on Snowflake/Python (a toy example follows this posting). Executing strategic projects to help ABB improve excellence in people, performance, and culture.

Qualifications for the role: Bachelor's/Master's degree in Applied Statistics/Mathematics, Engineering, Operations Research, or a related field. At least 3-5 years of experience in consulting, shared services, or software development, with proficient data analysis techniques using technologies like Excel, VBA scripting, Python, and Snowflake. Understanding of SAP/Workday HCM/Snowflake systems is preferred. The candidate should have a motivated mindset with advanced quantitative skills and the ability to work on large datasets, and should be able to generate actionable insights from data analysis that translate into valuable business decisions for the client. Capable EDA practitioner with extensive knowledge of advanced Excel functionality. Experience in designing and maintaining dashboards/reports providing diagnostic and forecasting views using VBA, Power BI, Qlik, or Tableau. Adept problem-solving and analytical skills, able to build a story from fragmented data, with a zeal to learn new technologies and for continuous improvement. Collaborative worker with the excellent collaboration skills required in a global virtual work environment: team-oriented, self-motivated, and able to lead small to mid-size projects.

More about us: The ABB Robotics & Discrete Automation business area provides robotics and machine and factory automation, including products, software, solutions, and services. Revenues are generated both from direct sales to end users and from indirect sales, mainly through system integrators and machine builders. abb.com/robotics

We value people from different backgrounds. Apply today for your next career step within ABB and visit abb.com to learn about the impact of our solutions across the globe. #MyABBStory

It has come to our attention that the name of ABB is being used to ask candidates to make payments for job opportunities (interviews, offers). Please be advised that ABB makes no such requests. All our open positions are made available on our career portal for everyone fitting the criteria to apply. ABB does not charge any fee whatsoever for the recruitment process. Please do not make payments to any individuals/entities in connection with recruitment with ABB, even if it is claimed that the money is refundable. ABB is not liable for such transactions. For current open positions, visit our career website https://global.abb/group/en/careers and apply. Please refer to the detailed recruitment fraud caution notice at https://global.abb/group/en/careers/how-to-apply/fraud-warning
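Purely as an illustration of the people-analytics KPI work described above, a toy attrition-rate calculation in pandas; the divisions and numbers are invented, and real inputs would come from SAP/Workday extracts or Snowflake.

```python
import pandas as pd

# Illustrative headcount snapshot; in practice this would be queried from HCM data.
hc = pd.DataFrame({
    "division":  ["Robotics", "Robotics", "Electrification", "Electrification"],
    "year":      [2023, 2024, 2023, 2024],
    "headcount": [1200, 1260, 2300, 2240],
    "exits":     [96, 101, 207, 212],
})
hc["attrition_rate"] = (hc["exits"] / hc["headcount"]).round(3)

# Pivot into the shape a dashboard tile would consume: divisions by year.
print(hc.pivot(index="division", columns="year", values="attrition_rate"))
```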
Posted 3 weeks ago
2.0 - 5.0 years
6 - 10 Lacs
Mumbai
Work from Office
About This Role

Aladdin Data Introduction: BlackRock is one of the world's leading asset management firms, and Aladdin is the firm's end-to-end operating system for investment professionals to see their whole portfolio, understand risk exposure, and act with precision. Aladdin is our operating platform to manage financial portfolios. It unites the client data, operators, and technology needed to manage transactions in real time through every step of the investment process. Aladdin Data is at the core of the Aladdin platform, and increasingly, our ability to consume, store, analyze, and gain insight from data is a key component of our competitive advantage. Our mission is to deliver critical insights to our stakeholders, enabling them to make data-driven decisions.

BlackRock's Data Operations team is at the heart of our data ecosystem, ensuring seamless data pipeline operations across the firm. Within this team, the Process Engineering group focuses on building tools to enhance observability, improve operator experience, streamline operations, and provide analytics that drive continuous improvement across the organization.

Key Responsibilities

Process Engineering & Solutioning: Design and develop tools and products to support operational efficiency, observability, risk management, and KPI tracking. Define success criteria for data operations in collaboration with stakeholders across teams. Break down complex data challenges into scalable, manageable solutions aligned with business needs. Proactively identify operational inefficiencies and deliver data-driven improvements.

Testing & Quality Assurance: Collaborate with data engineering and stewardship teams to validate data integrity throughout ETL processes (a toy reconciliation follows this posting). Ensure compliance with data governance controls during testing and assist in resolving data quality issues.

Collaboration & Delivery: Work closely with internal and external stakeholders to align technical solutions with business objectives. Communicate effectively with both technical and non-technical teams. Operate in an agile environment, managing multiple priorities and ensuring timely delivery of high-quality data solutions.

What We're Looking For: 4+ years in data engineering, data operations, data analytics, or related roles. Bachelor's degree in Computer Science, Information Technology, Finance, or a related field. Experience in financial services is a plus, but not required.

Technical Skills: Strong proficiency in SQL, Python, and data visualization tools (Power BI, Tableau). Experience with Unix environments and modern cloud platforms like Snowflake. Familiarity with frontend frameworks (React JS) is a plus. Understanding of database types (Relational, NoSQL, Graph) and modern data architecture. Familiarity with pipeline monitoring tools and logging frameworks (ELK, Grafana).

Soft Skills: Strong analytical and problem-solving skills with excellent attention to detail. Ability to clearly communicate complex concepts to varied audiences. Proven organizational skills and experience managing multiple projects in agile environments. Team player with a collaborative mindset.

Preferred Qualifications: Snowflake certification. Power BI certification. Experience with Agile development methodology.

Our Benefits: To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model: BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock: At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment: the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit blackrock.com | Twitter: @blackrock | LinkedIn: linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
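For illustration only: the ETL data-integrity validation this posting describes often reduces to reconciling row counts between staging and target. A minimal sketch using an in-memory SQLite database as a stand-in for the warehouse; the tables and counts are invented.

```python
import sqlite3

# In-memory stand-in for a warehouse session; tables and counts are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE staging_counts   (load_date TEXT, src_rows INTEGER);
CREATE TABLE warehouse_counts (load_date TEXT, tgt_rows INTEGER);
INSERT INTO staging_counts   VALUES ('2024-06-01', 5000), ('2024-06-02', 5100);
INSERT INTO warehouse_counts VALUES ('2024-06-01', 5000), ('2024-06-02', 5094);
""")

# Reconciliation: any date where loaded rows differ from source rows is a defect.
for row in con.execute("""
        SELECT s.load_date, s.src_rows, t.tgt_rows, s.src_rows - t.tgt_rows AS missing
        FROM staging_counts s JOIN warehouse_counts t USING (load_date)
        WHERE s.src_rows <> t.tgt_rows"""):
    print(row)   # -> ('2024-06-02', 5100, 5094, 6)
```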
Posted 3 weeks ago
5.0 - 10.0 years
3 - 7 Lacs
Pune, Bengaluru
Hybrid
Key Responsibilities: Design, develop, and maintain data pipelines and integrations using Snowflake. Write efficient and complex SQL queries for data transformation and reporting. Develop automation scripts and tools using Python. Optimize data workflows for performance and scalability. Collaborate with data engineers, analysts, and business stakeholders to understand data needs. Ensure data accuracy, consistency, and security across various data layers.
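As a hedged sketch of the first two responsibilities (Snowflake pipelines plus complex transformation SQL), a deduplication step written with the Snowflake Python connector; the account, credentials, and table names are placeholders, not a definitive implementation.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection parameters are placeholders; supply real credentials via env/secrets.
con = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="STAGING",
)
try:
    cur = con.cursor()
    # Illustrative transformation: keep only the latest record per order_id.
    cur.execute("""
        CREATE OR REPLACE TABLE reporting.orders_clean AS
        SELECT order_id, customer_id, amount, order_ts
        FROM (
            SELECT *, ROW_NUMBER() OVER (
                PARTITION BY order_id ORDER BY order_ts DESC) AS rn
            FROM raw.orders
        )
        WHERE rn = 1
    """)
finally:
    con.close()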
Posted 3 weeks ago
10.0 - 12.0 years
25 - 27 Lacs
Indore, Hyderabad, Pune
Work from Office
We are seeking a skilled Lead Data Engineer with extensive experience in Snowflake, ADF, SQL, and other relevant data technologies to join our team. As a key member of our data engineering team, you will play an instrumental role in designing, developing, and managing data pipelines, working closely with cross-functional teams to drive the success of our data initiatives. Key Responsibilities: Design, implement, and maintain data solutions using Snowflake, ADF, and SQL Server to ensure data integrity, scalability, and high performance. Lead and contribute to the development of data pipelines, ETL processes, and data integration solutions, ensuring the smooth extraction, transformation, and loading of data from diverse sources (see the sketch after this posting). Work with MSBI, SSIS, and Azure Data Lake Storage to optimize data flows and storage solutions. Collaborate with business and technical teams to identify project needs, estimate tasks, and set intermediate milestones to achieve final outcomes. Implement industry best practices related to Business Intelligence and Data Management, ensuring adherence to usability, design, and development standards. Perform in-depth data analysis to resolve data issues and improve overall data quality. Mentor and guide junior data engineers, providing technical expertise and supporting the development of their skills. Effectively collaborate with geographically distributed teams to ensure project goals are met in a timely manner. Required Technical Skills: T-SQL, SQL Server, MSBI (SQL Server Integration Services, Reporting Services), Snowflake, Azure Data Factory (ADF), SSIS, Azure Data Lake Storage. Proficient in designing and developing data pipelines, data integration, and data management workflows. Strong understanding of cloud data solutions, with a focus on Azure-based tools and technologies. Nice to Have: Experience with Power BI for data visualization and reporting. Familiarity with Azure Databricks for data processing and advanced analytics. Mandatory Key Skills: Azure Data Lake Storage, Business Intelligence, Data Management, T-SQL, Power BI, Azure Databricks, Cloud Data Solutions, Snowflake*, ADF*, SQL Server*, MSBI*, SSIS*
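One concrete shape the incremental Snowflake/ADF pipeline work above usually takes is a MERGE run after each day's extract lands. A hedged sketch kept as a SQL string; every table and column name below is invented.

```python
# Illustrative incremental-load MERGE; an ADF pipeline would typically execute
# this (or an equivalent stored procedure) after staging a day's delta.
MERGE_SQL = """
MERGE INTO dw.dim_customer AS tgt
USING stg.customer_delta AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED AND src.updated_at > tgt.updated_at THEN UPDATE SET
  name = src.name, segment = src.segment, updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, name, segment, updated_at)
  VALUES (src.customer_id, src.name, src.segment, src.updated_at);
"""
```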
Posted 3 weeks ago
3.0 - 8.0 years
15 - 18 Lacs
Hyderabad
Work from Office
BI Analyst: WebFOCUS, SQL, Tableau, Alteryx, Power BI for data visualization and workflow automation; banking/financial services industry experience; ETL processes; cloud BI environments (AWS, Azure); Agile/Scrum; Snowflake; InfoAssist, App Studio, BI Portal.
Posted 3 weeks ago
8.0 - 13.0 years
20 - 35 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
With a startup spirit and 115,000+ curious and courageous minds, we have the expertise to go deep with the world's biggest brands, and we have fun doing it. We dream in digital, dare in reality, and reinvent the ways companies work to make an impact far bigger than just our bottom line. We're harnessing the power of technology and humanity to create meaningful transformation that moves us forward in our pursuit of a world that works better for people. Now, we're calling upon the thinkers and doers: those with a natural curiosity and a hunger to keep learning and keep growing, people who thrive on fearlessly experimenting, seizing opportunities, and pushing boundaries to turn our vision into reality. And as you help us create a better world, we will help you build your own intellectual firepower. Welcome to the relentless pursuit of better. Inviting applications for the role of Lead Consultant, AWS Data Lake! Responsibilities: • Knowledge of Data Lake on AWS services, with exposure to creating external tables and Spark programming; able to work on Python programming. • Writing effective and scalable Python code for automation, data wrangling, and ETL. • Designing and implementing robust applications and working on automation using Python. • Debugging applications to ensure low latency and high availability. • Writing optimized custom SQL queries. • Experienced in team and client handling. • Strong documentation skills covering systems, design, and delivery. • Integrating user-facing elements into applications. • Knowledge of external tables and Data Lake concepts (a small sketch follows this posting). • Able to allocate tasks, collaborate on status exchanges, and drive things to successful closure. • Implementing security and data protection solutions. • Must be capable of writing SQL queries for validating dashboard outputs. • Must be able to translate visual requirements into detailed technical specifications. • Well versed in handling Excel, CSV, text, JSON, and other unstructured file formats using Python. • Expertise in at least one popular Python framework (like Django, Flask, or Pyramid). • Good understanding of and exposure to Git, Bamboo, Confluence, and Jira. • Good with DataFrames and ANSI SQL using pandas. • Team player with a collaborative approach and excellent communication skills. Qualifications we seek in you! Minimum qualifications: BE/B.Tech/MCA. Excellent written and verbal communication skills. Good knowledge of Python and PySpark. Preferred qualifications/skills: Strong ETL knowledge of any ETL tool is good to have. Good to have knowledge of AWS cloud and Snowflake. Knowledge of PySpark is a plus. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
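As a small sketch of the external-table and Python wrangling work this posting names, a curation step that writes partitioned Parquet plus the Athena DDL that would sit over it. The bucket, paths, and columns are invented, and s3fs/pyarrow are assumed installed.

```python
import pandas as pd

# Curate a raw CSV extract into partitioned Parquet (paths are placeholders).
orders = pd.read_csv("s3://example-bucket/raw/orders.csv")
orders["order_date"] = pd.to_datetime(orders["order_ts"]).dt.date.astype(str)
orders.to_parquet("s3://example-bucket/curated/orders/",
                  partition_cols=["order_date"])

# Athena/Glue external table over the curated Parquet (kept as a DDL string):
CREATE_EXTERNAL = """
CREATE EXTERNAL TABLE IF NOT EXISTS curated.orders (
  order_id string,
  amount double
)
PARTITIONED BY (order_date string)
STORED AS PARQUET
LOCATION 's3://example-bucket/curated/orders/';
"""
```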
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Chennai
Hybrid
Key Skills: Snowflake (SnowSQL, Snowflake PL/SQL, and Snowpark), strong Python, Airflow/DBT, any DevOps tools, AWS/Azure cloud skills. Requirements: Looking for an engineer for an information warehouse. The warehouse is based on AWS/Azure, DBT, and Snowflake. Strong programming experience with Python. Experience with workflow management tools like Argo/Oozie/Airflow. Experience in Snowflake modelling: roles, schemas, databases. Experience in data modeling (Data Vault). Experience in the design and development of data transformation pipelines using the DBT framework.
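A minimal sketch of the Airflow-plus-DBT orchestration this posting describes: a daily DAG that runs dbt models against the Snowflake warehouse and then runs the tests. Assumes Airflow 2.x; the DAG id, project path, and target are placeholders.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_dbt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Build the Snowflake models, then validate them; paths are placeholders.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/warehouse --target prod",
    )
    dbt_run >> dbt_test
```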
Posted 3 weeks ago
3.0 - 8.0 years
20 - 30 Lacs
Chennai
Hybrid
Job Title: Senior Data Engineer - Data Products. Location: Chennai, India. Open Roles: 2. Mode: Hybrid. About the Role: Are you a hands-on data engineer who thrives on solving complex data challenges and building modern cloud-native solutions? We're looking for two experienced Senior Data Engineers to join our growing Data Engineering team. This is an exciting opportunity to work on cutting-edge data platform initiatives that power advanced analytics, AI solutions, and digital transformation across a global enterprise. In this role, you'll help design and build reusable, scalable, and secure data pipelines on a multi-cloud infrastructure, while collaborating with cross-functional teams in a highly agile environment. What You'll Do: Design and build robust data pipelines and ETL frameworks using modern tools and cloud platforms. Implement lakehouse architecture (Bronze/Silver/Gold layers) and support data product publishing via Unity Catalog (see the sketch after this posting). Work with structured and unstructured enterprise data including ERP, CRM, and product data systems. Optimize pipeline performance, reliability, and security across AWS and Azure environments. Automate infrastructure using IaC tools like Terraform and AWS CDK. Collaborate closely with data scientists, analysts, and platform teams to deliver actionable data products. Participate in agile ceremonies, conduct code reviews, and contribute to team knowledge sharing. Ensure compliance with data privacy, cybersecurity, and governance policies. What You Bring: 3+ years of hands-on experience in data engineering roles. Strong command of SQL and Python; experience with Scala is a plus. Proficiency in cloud platforms (AWS, Azure), Databricks, DBT, Airflow, and version control tools like GitLab. Hands-on experience implementing lakehouse architectures and multi-hop data flows using Delta Lake. Background in working with enterprise data systems like SAP, Salesforce, and other business-critical platforms. Familiarity with DevOps, DataOps, and agile delivery methods (Jira, Confluence). Strong understanding of data security, privacy compliance, and production-grade pipeline management. Excellent communication skills and ability to work in global, multicultural teams. Why Join Us? Opportunity to work with modern data technologies in a complex, enterprise-scale environment. Be part of a collaborative, forward-thinking team that values innovation and continuous learning. Hybrid work model that offers both flexibility and team engagement. A role where you can make a real impact by contributing to digital transformation and data-driven decision-making.
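For illustration: a Bronze-to-Silver hop in the medallion (Bronze/Silver/Gold) lakehouse this posting describes, written in PySpark over Delta Lake. Paths, schema, and the quality rule are invented; a Spark runtime with the Delta Lake package is assumed.

```python
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder.appName("silver_orders")
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

# Bronze holds raw ingested rows; Silver gets deduplicated, validated records.
bronze = spark.read.format("delta").load("/lake/bronze/orders")
silver = (bronze
          .dropDuplicates(["order_id"])
          .filter(F.col("amount") > 0)                      # basic quality gate
          .withColumn("ingest_date", F.to_date("ingest_ts")))
silver.write.format("delta").mode("overwrite").save("/lake/silver/orders")
```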
Posted 3 weeks ago
7.0 - 10.0 years
20 - 25 Lacs
Hyderabad, Pune
Work from Office
Overall experience: 5+ years of experience in IICS and Snowflake. Proven experience in implementing ETL solutions with a focus on IICS and Snowflake. Strong hands-on development experience in IICS, including CDI, CAI, and Mass Ingestion. Proficiency in using various connectors for different source/file formats and databases. Knowledge of web services and their integration into ETL processes. Administration skills related to IICS, ensuring a smooth operational environment. Proven experience as a Snowflake developer with a strong focus on data warehousing. Hands-on experience in designing and implementing data models within the Snowflake environment. Proficient in developing ETL processes using Snowflake features and SQL. Knowledge of security best practices and access control within Snowflake. Familiarity with data integration and data warehouse concepts. Experience with data migration to Snowflake from other platforms is a plus. Excellent problem-solving skills with the ability to analyze and resolve critical issues. Strong organizational and project management skills. Effective communication skills for customer interactions and status updates. Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success. Eager to contribute to a team-oriented environment. Strong prioritization and multi-tasking skills with a track record of meeting deadlines. Ability to be creative and analytical in a problem-solving environment. Effective verbal and written communication skills. Adaptable to new environments, people, technologies, and processes. Ability to manage ambiguity and solve undefined problems.
Posted 3 weeks ago
4.0 - 6.0 years
8 - 13 Lacs
Pune
Work from Office
Horizon Job: ETL Developer or DWH/BI Developer. Job Seniority: Advanced (4-6 years) or Experienced (3-4 years). Location: Magarpatta City, Pune (Hybrid). Unit: Amdocs Data and Intelligence. Technical Skills (all mandatory experience must be reflected in the resume's roles and responsibilities): Mandatory working experience in Azure Databricks/PySpark. Expert knowledge in Oracle/SQL, with the ability to write complex SQL/PL-SQL and performance tune. 2+ years of experience in Snowflake. 2+ years of hands-on experience in Spark or Databricks to build data pipelines. Strong experience with cloud technologies. 1+ years of hands-on experience in development, performance tuning, and loading into Snowflake. Experience working with Azure Repos or GitHub. 1+ years of hands-on experience working with Azure DevOps, GitHub, or any other DevOps tool. Hands-on in Unix and advanced Unix shell scripting. Open to working in shifts. Notice period: immediate to 1 month. Excellent communication skills. This is a C2H opportunity. Interested candidates, share your resume at dipti.bhaisare@in.experis.com
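A hedged sketch of the "Databricks/Spark pipeline loading into Snowflake" combination this posting asks for, using the Snowflake Spark connector; every connection option is a placeholder and the DataFrame is invented.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("load_to_snowflake").getOrCreate()
df = spark.createDataFrame([(1, "EU", 42.0)], ["order_id", "region", "amount"])

# Placeholder connection options; real values come from secrets management.
sf_options = {
    "sfURL": "my_account.snowflakecomputing.com",
    "sfUser": "etl_user", "sfPassword": "***",
    "sfDatabase": "ANALYTICS", "sfSchema": "CURATED", "sfWarehouse": "LOAD_WH",
}
(df.write
   .format("snowflake")        # "net.snowflake.spark.snowflake" outside Databricks
   .options(**sf_options)
   .option("dbtable", "ORDERS_CURATED")
   .mode("append")
   .save())
```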
Posted 3 weeks ago
3.0 - 6.0 years
12 - 22 Lacs
Bengaluru, Delhi / NCR
Hybrid
Role & responsibilities: Bachelor's degree in Computer Science, Engineering, or a related field. • 3+ years of experience in software development. • Experience with cloud platforms (e.g., AWS, Azure). • Proficiency in Kafka for real-time data streaming. • Strong experience with Snowflake for data warehousing. • Expertise in Terraform for infrastructure automation. • Solid knowledge of the .NET framework for application development. • Extensive experience with SQL Server for database management. • Proficiency in SSIS for ETL processes. • Familiarity with Agile development methodologies.
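For illustration of the Kafka real-time streaming skill named above, a minimal Python consumer that micro-batches events before staging them to a warehouse. The topic, broker, and message shape are invented; kafka-python is assumed installed, and real code would COPY or Snowpipe the batch rather than print.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders.events",                      # hypothetical topic
    bootstrap_servers="broker:9092",      # hypothetical broker
    group_id="warehouse-loader",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

batch = []
for msg in consumer:
    batch.append(msg.value)
    if len(batch) >= 500:                 # micro-batch before staging
        print(f"flushing {len(batch)} events")   # stand-in for the actual load
        batch.clear()
```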
Posted 3 weeks ago
3.0 - 8.0 years
16 - 18 Lacs
Hyderabad
Work from Office
We are hiring a Data Management Specialist (Level 2) for a US-based IT company in Hyderabad. Job Title: Data Management Specialist Level 2. Location: Hyderabad. Experience: 3+ years. CTC: 16 LPA - 18 LPA. Working shift: Day shift. We are seeking a Level 2 Data Management Specialist to join our data team and support the development, maintenance, and optimization of data pipelines and cloud-based data platforms. The ideal candidate will have hands-on experience with Snowflake, along with a solid foundation in SQL, data integration, and cloud data technologies. As a mid-level contributor, this position will collaborate closely with senior data engineers and business analysts to deliver reliable, high-quality data solutions for reporting, analytics, and operational needs. You will help develop scalable data workflows, resolve data quality issues, and ensure compliance with data governance practices. Key Responsibilities: Design, build, and maintain scalable data pipelines using Snowflake and SQL-based transformation logic. Assist in developing and optimizing data models to support reporting and business intelligence efforts. Write efficient SQL queries for data extraction, transformation, and analysis. Collaborate with cross-functional teams to gather data requirements and implement dependable data solutions. Support data quality checks and validation procedures to ensure data integrity and consistency. Contribute to data integration tasks across various sources, including relational databases and cloud storage. Document technical workflows, data definitions, and transformation logic for reference and compliance. Monitor the performance of data processes and help troubleshoot workflow issues. Required Skills & Qualifications: 2-4 years of experience in data engineering or data management roles. Proficiency in Snowflake for data development or analytics. Strong SQL skills and a solid grasp of relational database concepts. Familiarity with ETL/ELT tools such as Informatica, Talend, or dbt. Basic understanding of cloud platforms like AWS, Azure, or GCP. Knowledge of data modeling techniques (e.g., star and snowflake schemas; an example query follows this posting). Excellent attention to detail, strong analytical thinking, and problem-solving skills. Effective team player with the ability to clearly communicate technical concepts. Preferred Skills: Exposure to data governance or data quality frameworks. Experience working in the banking or financial services industry. Basic scripting skills in Python or Shell. Familiarity with Agile/Scrum methodologies. Experience using Git or other version control tools. For further assistance, contact/WhatsApp: 9354909521 / 9354909512, or write to pankhuri@gist.org.in
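As a small illustration of the star-schema modeling this posting mentions, the typical shape of a reporting query over one fact table and two dimensions, kept as a SQL string; all table and column names are invented.

```python
# A star schema in miniature: fact_sales keyed to a date and a product dimension.
STAR_QUERY = """
SELECT d.calendar_month,
       p.category,
       SUM(f.revenue) AS revenue
FROM   fact_sales   f
JOIN   dim_date     d ON f.date_key    = d.date_key
JOIN   dim_product  p ON f.product_key = p.product_key
GROUP BY d.calendar_month, p.category
ORDER BY d.calendar_month;
"""
```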
Posted 3 weeks ago
4.0 - 8.0 years
15 - 27 Lacs
Chennai
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in! REQUIREMENTS: Total experience of 4+ years. Excellent knowledge of and experience in big data engineering. Strong experience with AWS services, especially S3, Glue, Athena, and EMR. Hands-on programming experience in Python, Spark, SQL, and Talend. Proficiency in working with data warehouses such as Amazon Redshift and Snowflake. Experience handling structured and semi-structured data. Strong understanding of ETL/ELT processes and data transformation techniques. Proven experience in cross-functional collaboration with technical and business teams. Familiarity with data modeling, data warehousing, and building distributed systems. Expertise in Spanner for high-availability, scalable database solutions. Knowledge of data governance and security practices in cloud-based environments. Problem-solving mindset with the ability to tackle complex data engineering challenges. Strong communication and teamwork skills, with the ability to mentor and collaborate effectively. Experience with creating technical documentation and solution designs. RESPONSIBILITIES: Writing and reviewing great quality code. Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements. Mapping decisions to requirements and translating them for developers. Identifying different solutions and narrowing down the best option that meets the client's requirements. Defining guidelines and benchmarks for NFR considerations during project implementation. Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers. Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed. Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it. Understanding and relating technology integration scenarios and applying these learnings in projects. Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and justifying the decisions taken. Carrying out POCs to make sure that suggested designs/technologies meet the requirements.
Posted 3 weeks ago
5.0 - 7.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Engineer / Technical Lead. Location: Bangalore. Employment Type: Full-time. Role Summary: We are seeking a highly skilled and motivated Senior Data Engineer / Technical Lead to take ownership of the end-to-end delivery of a key project involving data lake transitions, data warehouse maintenance, and enhancement initiatives. The ideal candidate will bring strong technical leadership, excellent communication skills, and hands-on expertise with modern data engineering tools and platforms. Experience in Databricks and JIRA is highly desirable. Knowledge of the supply chain and finance domains is a plus, or a willingness to quickly ramp up in these areas is expected. Key Responsibilities: Delivery Management: Lead and manage data lake transition initiatives under the Gold framework. Oversee delivery of enhancements and defect fixes related to the enterprise data warehouse. Technical Leadership: Design and develop efficient, scalable data pipelines using Python, PySpark, and SQL. Ensure adherence to coding standards, performance benchmarks, and data quality goals. Conduct performance tuning and infrastructure optimization for data solutions. Provide code reviews, mentorship, and technical guidance to the engineering team. Collaboration & Stakeholder Engagement: Collaborate with business stakeholders (particularly the Laboratory Products team) to gather, interpret, and refine requirements. Communicate technical solutions and project progress clearly to both technical and non-technical audiences. Tooling and Technology Use: Leverage tools such as Databricks, Informatica, AWS Glue, Google DataProc, and Airflow for ETL and data integration. Use JIRA to manage project workflows, track defects, and report progress. Documentation and Best Practices: Create and review documentation including architecture, design, testing, and deployment artifacts. Define and promote reusable templates, checklists, and best practices for data engineering tasks. Domain Adaptation: Apply or gain knowledge in the supply chain and finance domains to enhance project outcomes and align with business needs. Skills and Qualifications: Technical Proficiency: Strong hands-on experience in Python, PySpark, and SQL. Expertise with ETL tools such as Informatica, AWS Glue, Databricks, and Google Cloud DataProc. Deep understanding of data warehousing solutions (e.g., Snowflake, BigQuery, Delta Lake, Lakehouse architectures). Familiarity with performance tuning, cost optimization, and data modeling best practices. Platform & Tools: Proficient in working with cloud platforms like AWS, Azure, or Google Cloud. Experience in version control and configuration management practices. Working knowledge of JIRA and Agile methodologies. Certifications (preferred but not required): Certifications in cloud technologies, ETL platforms, or a relevant domain (e.g., AWS Data Engineer, Databricks Data Engineer, supply chain certification). Expected Outcomes: Timely and high-quality delivery of data engineering solutions. Reduction in production defects and improved pipeline performance. Increased team efficiency through reuse of components and automation. Positive stakeholder feedback and high team engagement. Consistent adherence to SLAs, security policies, and compliance guidelines.
Performance Metrics: Adherence to project timelines and engineering standards. Reduction in post-release defects and production issues. Improvement in data pipeline efficiency and resource utilization. Resolution time for pipeline failures and data issues. Completion of required certifications and training. Preferred Background: Background or exposure to the supply chain or finance domains. Willingness to work during morning US East hours. Ability to work independently and drive initiatives with minimal oversight. Required Skills: Databricks, Data Warehousing, ETL, SQL
Posted 3 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Pune, Chennai
Hybrid
Snowflake + SQL. 5 to 15 years. Pune/Chennai. If shortlisted, the candidate should be available for a face-to-face (F2F) interview in Pune or Chennai.
Posted 3 weeks ago
5.0 - 10.0 years
15 - 25 Lacs
Pune
Hybrid
About You: You are a self-motivated, proactive individual who thrives in a fast-paced environment. You have a strong eagerness to learn and grow, continuously staying updated with the latest trends and technologies in data engineering. Your passion for collaboration makes you a valuable team player, contributing to a positive work culture while also guiding and mentoring junior team members. You're excited about problem-solving and have the ability to take ownership of projects from start to finish. With a keen interest in data-driven decision-making, you are ready to work on cutting-edge solutions that have a direct impact on the business. Role and Responsibilities: As a Data Engineer, you will play a crucial role in leading and managing strategic data initiatives across the business. Your responsibilities will include: Leading data engineering projects across key business functions, including Marketing, Sales, Customer Success, and Product R&D. Developing and maintaining data pipelines to extract, transform, and load (ETL) data into data warehouses or data lakes. Designing and implementing ETL processes, ensuring the integrity, scalability, and performance of the data architecture. Leading data modeling efforts, ensuring that data is structured for optimal performance and that security best practices are maintained. Collaborating with data scientists, analysts, and stakeholders to understand data requirements and provide valuable insights across the customer journey. Guiding and mentoring junior engineers, providing technical leadership and ensuring best practices are followed. Maintaining documentation for data structures, ETL processes, and data lineage, ensuring clarity and ease of understanding across the team. Developing and maintaining data security, compliance, and retention protocols as part of best-practice initiatives. Professional Expertise, Must-Have Skills: 5+ years of experience in data engineering, data warehousing, and building enterprise-level data integrations. Proficiency in SQL, including query optimization and tuning for relational databases (Snowflake, MS SQL Server, Redshift, etc.). 2+ years of experience working with cloud platforms (AWS, GCP, Azure, or OCI). Expertise in Python and Spark for data extraction, manipulation, and data pipeline development. Experience with structured, semi-structured, and unstructured data formats (JSON, XML, Parquet, CSV). Familiarity with version control systems (Git, Bitbucket) and Agile methodologies (Jira). Ability to collaborate with data scientists and business analysts, providing data support and insights. Proven ability to work effectively in a team setting, balancing multiple projects and leading initiatives. Nice-to-Have Skills: Experience in the SaaS software industry. Knowledge of analytics governance, data literacy, and core visualization tools (Tableau, MicroStrategy). Familiarity with CRM and marketing automation tools (Salesforce, HubSpot, Eloqua). Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field (advanced degree preferred).
Posted 3 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Pune
Work from Office
New Opportunity: Full Stack Engineer. Location: Pune (Onsite). Company: Apptware Solutions. Experience: 4+ years. We're looking for a skilled Full Stack Engineer to join our team. If you have experience in building scalable applications and working with modern technologies, this role is for you. Role & Responsibilities: Develop product features to help customers easily transform data. Design, implement, deploy, and support client-side and server-side architectures, including web applications, CLI, and SDKs. Minimum Requirements: 4+ years of experience as a Full Stack Developer or in a similar role. Hands-on experience in a distributed engineering role with direct operational responsibility (on-call experience preferred). Proficiency in at least one back-end language (Node.js, TypeScript, Python, or Go). Front-end development experience with Angular or React, HTML, CSS. Strong understanding of web applications, backend APIs, CI/CD pipelines, and testing frameworks. Familiarity with NoSQL databases (e.g., DynamoDB) and AWS services (Lambda, API Gateway, Cognito, etc.). Bachelor's degree in Computer Science, Engineering, Math, or equivalent experience. Strong written and verbal communication skills. Preferred Skills: Experience with AWS Glue, Spark, or Athena. Strong understanding of SQL and data engineering best practices. Exposure to analytical EDWs (Snowflake, Databricks, BigQuery, Cloudera, Teradata). Experience in B2B applications, SaaS offerings, or startups is a plus. (ref:hirist.tech)
Posted 3 weeks ago
9.0 - 14.0 years
25 - 40 Lacs
Bengaluru
Hybrid
Greetings from tsworks Technologies India Pvt. We are hiring a Sr. Data Engineer - Snowflake with AWS. If you are interested, please share your CV at mohan.kumar@tsworks.io. Position: Senior Data Engineer. Experience: 9+ years. Location: Bengaluru, India (Hybrid). Mandatory Required Qualifications: Strong proficiency in AWS data services such as S3 buckets, Glue and Glue Catalog, EMR, Athena, Redshift, DynamoDB, QuickSight, etc. Strong hands-on experience building data lakehouse solutions on Snowflake, using features such as streams, tasks, dynamic tables, data masking, and data exchange (a streams-and-tasks sketch follows this posting). Hands-on experience using scheduling tools such as Apache Airflow, DBT, and AWS Step Functions, and data governance products such as Collibra. Expertise in DevOps and CI/CD implementation. Excellent communication skills. In This Role, You Will: Design, implement, and manage scalable and efficient data architecture on the AWS cloud platform. Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes. Perform complex data transformations and processing using PySpark (AWS Glue, EMR, or Databricks), Snowflake's data processing capabilities, or other relevant tools. Work hands-on with data lake solutions such as Apache Hudi, Delta Lake, or Iceberg. Develop and maintain data models within Snowflake and related tools to support reporting, analytics, and business intelligence needs. Collaborate with cross-functional teams to understand data requirements and design appropriate data integration solutions. Integrate data from various sources, both internal and external, ensuring data quality and consistency. Skills & Knowledge: Bachelor's degree in Computer Science, Engineering, or a related field. 9+ years of experience in Information Technology, designing, developing, and executing solutions. 4+ years of hands-on experience in designing and executing data solutions on the AWS and Snowflake cloud platforms as a Data Engineer. Strong proficiency in AWS services such as Glue, EMR, Athena, and Databricks, with file formats such as Parquet and Avro. Hands-on experience in data modelling and batch and real-time pipelines, using Python, Java, or JavaScript, and experience working with RESTful APIs. Hands-on experience handling real-time data streams from Kafka or Kinesis. Expertise in DevOps and CI/CD implementation. Hands-on experience with SQL and NoSQL databases. Hands-on experience in data modelling, implementation, and management of OLTP and OLAP systems. Knowledge of data quality, governance, and security best practices. Familiarity with machine learning concepts and integration of ML pipelines into data workflows. Hands-on experience working in an Agile setting. Self-driven, naturally curious, and able to adapt to a fast-paced work environment. Can articulate, create, and maintain technical and non-technical documentation. AWS and Snowflake certifications are preferred.
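For illustration of the Snowflake streams-and-tasks feature this posting names: a stream captures new rows landing in a raw table, and a scheduled task merges them forward only when the stream has data. Kept as a DDL string; the schemas, tables, and warehouse are invented.

```python
STREAM_TASK_DDL = """
-- Capture change records on the landing table.
CREATE OR REPLACE STREAM landing.orders_stream ON TABLE landing.orders;

-- Apply new inserts forward every 5 minutes, but only if there is data.
CREATE OR REPLACE TASK curated.apply_orders
  WAREHOUSE = LOAD_WH
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('landing.orders_stream')
AS
  INSERT INTO curated.orders
  SELECT order_id, customer_id, amount, order_ts
  FROM landing.orders_stream
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK curated.apply_orders RESUME;   -- tasks are created suspended
"""
```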
Posted 3 weeks ago
9.0 - 14.0 years
25 - 40 Lacs
Bengaluru
Hybrid
Greetings from tsworks Technologies India Pvt. We are hiring a Sr. Data Engineer - Snowflake with Azure. If you are interested, please share your CV at mohan.kumar@tsworks.io. Position: Senior Data Engineer. Experience: 10+ years. Location: Bengaluru, India (Hybrid). Mandatory Required Qualifications: Strong proficiency in Azure data services such as Azure Data Lake Storage Gen 2, Azure Data Factory, Azure Databricks, Fabric Analytics, Azure Event Hubs, Azure Function App, etc. Strong hands-on experience building data lakehouse solutions on Snowflake, using features such as streams, tasks, dynamic tables, data masking, and data exchange. Hands-on experience using scheduling and orchestration tools such as Apache Airflow and DBT, and data governance products such as Collibra. Expertise in DevOps and CI/CD implementation. Excellent communication skills. In This Role, You Will: Design, implement, and manage scalable and efficient data architecture on the Azure cloud platform. Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes. Perform complex data transformations and processing using PySpark (Azure Fabric or Databricks), Snowflake's data processing capabilities, or other relevant tools. Work hands-on with data lake solutions such as Apache Hudi, Delta Lake, or Iceberg. Develop and maintain data models within Snowflake and related tools to support reporting, analytics, and business intelligence needs. Collaborate with cross-functional teams to understand data requirements and design appropriate data integration solutions. Integrate data from various sources, both internal and external, ensuring data quality and consistency. Ensure data models are designed for scalability, reusability, and flexibility. Implement data quality checks, validations, and monitoring processes to ensure data accuracy and integrity across Azure and Snowflake environments. Adhere to data governance standards and best practices to maintain data security and compliance. Handle performance optimization on the Azure and Snowflake platforms. Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver actionable insights. Provide guidance and mentorship to junior team members to enhance their technical skills. Maintain comprehensive documentation for data pipelines, processes, and architecture within both the Azure and Snowflake environments, including best practices, standards, and procedures. Skills & Knowledge: Bachelor's degree in Computer Science, Engineering, or a related field. 10+ years of experience in Information Technology, designing, developing, and executing solutions. 4+ years of hands-on experience in designing and executing data solutions on the Azure and Snowflake cloud platforms as a Data Engineer. In-depth knowledge of the Microsoft Azure and Fabric platforms and their data services. Strong proficiency in PySpark with Fabric or Databricks, with file formats such as Parquet and Avro. Proficient in using different features of Snowflake such as Snowpipe Streaming, Streams and Tasks, managing different virtual warehouses and ensuring cost optimization, Notebooks, Native Applications, Streamlit, Data Exchange, etc. (a Snowpipe sketch follows this posting). Proficiency using relational and NoSQL databases. Hands-on experience with streaming technologies such as Kafka and Flink. Strong understanding of data warehousing, data lakes, and data integration paradigms such as the Medallion Architecture. Knowledge of data quality, governance, and security best practices. Knowledge of data modeling and data governance using Snowflake and external tools such as Collibra or Microsoft Purview. Hands-on experience with Snowflake Cortex AI features. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Knowledge of data security and compliance standards. Hands-on experience working in an Agile setting. Self-driven, naturally curious, and able to adapt to a fast-paced work environment. Able to articulate, create, and maintain technical and non-technical documentation. Certifications such as SnowPro Core, Microsoft Certified: Azure Data Engineer, or Microsoft Certified: Fabric Data Engineer are preferred.
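A hedged sketch of the Snowpipe ingestion pattern this posting names: files arriving in an external stage over the Azure landing zone are auto-loaded into a raw table. Kept as a DDL string; the stage, table, and integration names are invented, and on Azure, AUTO_INGEST additionally requires a notification integration.

```python
SNOWPIPE_DDL = """
CREATE OR REPLACE PIPE raw.orders_pipe
  AUTO_INGEST = TRUE
  INTEGRATION = 'AZURE_NOTIF_INT'   -- placeholder notification integration
AS
  COPY INTO raw.orders
  FROM @raw.adls_stage/orders/
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
"""
```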
Posted 3 weeks ago
8.0 - 13.0 years
30 - 35 Lacs
Hyderabad
Work from Office
The Impact You Will Have in This Role: The Enterprise Intelligence Lead will be responsible for building data pipelines using their deep knowledge of Talend, SQL, and data analysis on the bespoke Snowflake data warehouse for Enterprise Intelligence. This role will be in the Claw team within Enterprise Data & Corporate Technology (EDCT). The Enterprise Intelligence team maintains the firm's business intelligence tools and data warehouse. Your Primary Responsibilities: Working on and leading engineering and development focused projects from start to finish with minimal supervision. Providing technical and operational support for our customer base as well as other technical areas within the company that utilize Claw. Risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk and audit related objectives. Administrative functions for our tools, such as keeping the tool documentation current and handling service requests. Participating in user training to increase awareness of Claw. Ensuring incident, problem and change tickets are addressed in a timely fashion, as well as escalating technical and managerial issues. Following DTCC's ITIL process for incident, change and problem resolution. Qualifications: Minimum of 8 years of related experience. Bachelor's degree preferred, or equivalent experience. Talents Needed for Success: Must have experience in Snowflake or SQL. Minimum of 5 years of related data warehousing work experience. 5+ years managing data warehouses in a production environment, including all phases of lifecycle management: planning, design, deployment, upkeep and retirement. Strong understanding of star/snowflake schemas and data integration methods and tools. Moderate to advanced competency with Windows and Unix-like operating system principles. Developed competencies around essential project management, communication (oral, written) and personal effectiveness. Working experience with MS Office tools such as Outlook, Excel, PowerPoint, Visio and Project. Optimize/tune source streams, queries, and Power BI dashboards. Good knowledge of the technical components of Claw (i.e., Snowflake, Talend, Power BI, PowerShell, Autosys).
Posted 3 weeks ago
3.0 - 8.0 years
10 - 20 Lacs
Chennai
Hybrid
Roles & Responsibilities: • We are looking for a strong Senior Data Engineer who will be primarily responsible for designing, building, and maintaining ETL/ELT pipelines. • Integration of data from multiple sources or vendors to provide holistic insights from data. • You are expected to build and manage data lake and data warehouse solutions, design data models, create ETL processes, implement data quality mechanisms, etc. • Perform EDA (exploratory data analysis) as required to troubleshoot data-related issues and assist in their resolution. • Should have experience in client interaction, oral and written. • Experience in mentoring juniors and providing required guidance to the team. Required Technical Skills: • Extensive experience in languages such as Python, PySpark, and SQL (basic and advanced). • Strong experience in data warehousing, ETL, data modelling, building ETL pipelines, and data architecture. • Must be proficient in Redshift, Azure Data Factory, Snowflake, etc. • Hands-on experience with cloud services like AWS S3, Glue, Lambda, CloudWatch, Athena, etc. • Knowledge of Dataiku and big data technologies is good to have, and basic knowledge of BI tools like Power BI, Tableau, etc. will be a plus. • Sound knowledge of data management, data operations, data quality, and data governance. • Knowledge of SFDC and Waterfall/Agile methodology. • Strong knowledge of the pharma domain / life sciences commercial data operations. Qualifications: • Bachelor's or Master's in Engineering/MCA or an equivalent degree. • 4-6 years of relevant industry experience as a Data Engineer. • Experience working with pharma syndicated data such as IQVIA, Veeva, Symphony; claims, CRM, sales, open data, etc. • High motivation, good work ethic, maturity, self-organization, and personal initiative. • Ability to work collaboratively and provide support to the team. • Excellent written and verbal communication skills. • Strong analytical and problem-solving skills. Location: • Chennai, India
Posted 3 weeks ago
1.0 - 5.0 years
5 - 9 Lacs
Pune
Work from Office
As a global leader in cybersecurity, CrowdStrike protects the people, processes, and technologies that drive modern organizations. Since 2011, our mission hasn't changed: we're here to stop breaches, and we've redefined modern security with the world's most advanced AI-native platform. Our customers span all industries, and they count on CrowdStrike to keep their businesses running, their communities safe, and their lives moving forward. We're also a mission-driven company. We cultivate a culture that gives every CrowdStriker both the flexibility and autonomy to own their careers. We're always looking to add talented CrowdStrikers to the team who have limitless passion, a relentless focus on innovation, and a fanatical commitment to our customers, our community, and each other. Ready to join a mission that matters? The future of cybersecurity starts with you.

About The Role:
The overall objective of this position is to bring a strong IT technical engineering background that ensures successful agile development and project delivery in the Enterprise Integration landscape. The current Enterprise Integration landscape is undergoing significant transformation, requiring a candidate with the right mindset to lead towards the renewed landscape. To be successful in this role, the candidate must be able to work on multiple projects simultaneously and review requirements with a critical mindset to secure the latest solution design fitting the overall enterprise architecture. The candidate is expected to ensure all business requirements are managed appropriately and deliverables are implemented, and must be able to manage changing priorities while working with the onsite team.

What You'll Do:
• Cooperate closely with the onsite team leaders and team to deliver high-quality, standards-based, and prioritised delivery items
• Assist the onsite team in translating business requirements into IT technical requirements
• Convert the IT technical requirements into solution designs; implement, test, and deliver the solutions
• Develop, maintain, and support the solutions delivered to production
• Inspect and verify the quality of releases before delivery
• Provide recommendations within IT and to the business on solutions
• Take responsibility for architecture, design, field mapping, and system documentation
• Support solution deployments, execute testing, and provide training to key users
• Stay self-organized and cross-functional
• Report sprint progress to the manager and the onsite team
• Be accountable for the solution design of the IT service, aligned with Solution Architecture and the IT Service Owner

What You'll Need:
• 7+ years of application integration development and migration experience
• Proven integration skills in REST APIs, JSON, XML, web services, and related technologies (a minimal integration sketch follows this listing)
• iPaaS or ETL experience is a huge plus
• 4+ years of solid hands-on design experience, practically well versed in the API-led methodology on MuleSoft version 4.x, MuleSoft ESB, CloudHub, and the Anypoint Platform
• 4+ years of API and web services development experience, implementing different integration patterns using MuleSoft
• Proven experience using CI/CD with CloudHub and the Anypoint Platform to manage the end-to-end deployment lifecycle
• Experience integrating applications such as Salesforce, NetSuite, SAP, Workday, Snowflake, etc.
• Experience writing SQL queries; experience with Snowflake is a plus
• Readiness to build POCs, work in agile, and a deep hunger to understand integrations with SaaS applications
• Experience working with Agile methodology
• Customer- and result-oriented approach
• Excellent written/verbal communication skills with a proven ability to effectively communicate with customers and partners; business English mandatory

Shift Timings: 2:00 PM - 11:00 PM (IST)
Location: Remote

Benefits Of Working At CrowdStrike:
• Remote-friendly and flexible work culture
• Market leader in compensation and equity awards
• Comprehensive physical and mental wellness programs
• Competitive vacation and holidays for recharge
• Paid parental and adoption leaves
• Professional development opportunities for all employees regardless of level or role
• Geographic neighbourhood groups and volunteer opportunities to build connections
• Vibrant office culture with world-class amenities
• Great Place to Work Certified™ across the globe

CrowdStrike is proud to be an equal opportunity employer. We are committed to fostering a culture of belonging where everyone is valued for who they are and empowered to succeed. We support veterans and individuals with disabilities through our affirmative action program.

CrowdStrike is committed to providing equal employment opportunity for all employees and applicants for employment. The Company does not discriminate in employment opportunities or practices on the basis of race, color, creed, ethnicity, religion, sex (including pregnancy or pregnancy-related medical conditions), sexual orientation, gender identity, marital or family status, veteran status, age, national origin, ancestry, physical disability (including HIV and AIDS), mental disability, medical condition, genetic information, membership or activity in a local human rights commission, status with regard to public assistance, or any other characteristic protected by law. We base all employment decisions, including recruitment, selection, training, compensation, benefits, discipline, promotions, transfers, lay-offs, return from lay-off, terminations, and social/recreational programs, on valid job requirements.

If you need assistance accessing or reviewing the information on this website or need help submitting an application for employment or requesting an accommodation, please contact us at recruiting@crowdstrike.com for further assistance.
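The API-led integration skills in this listing largely come down to calling REST endpoints reliably and moving JSON between systems. Here is a minimal, hypothetical Python sketch of that pattern using the requests library with retry handling; it is not MuleSoft code, and the endpoint URLs, token, and payload fields are illustrative assumptions rather than any real Salesforce or NetSuite API.

```python
# Hypothetical REST integration sketch: fetch records from a source system
# and push them to a target system, with basic retry handling.
# URLs, token names, and payload fields are illustrative assumptions.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session(token: str) -> requests.Session:
    """Session with an auth header and retries on transient server errors."""
    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {token}"})
    retries = Retry(total=3, backoff_factor=1.0,
                    status_forcelist=[429, 500, 502, 503, 504])
    session.mount("https://", HTTPAdapter(max_retries=retries))
    return session

def sync_accounts(source: requests.Session, target: requests.Session) -> int:
    """Copy account records from the source API to the target API."""
    resp = source.get("https://source.example.com/api/accounts", timeout=30)
    resp.raise_for_status()
    synced = 0
    for account in resp.json()["records"]:
        payload = {"externalId": account["id"], "name": account["name"]}
        post = target.post("https://target.example.com/api/accounts",
                           json=payload, timeout=30)
        post.raise_for_status()
        synced += 1
    return synced
```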
Posted 3 weeks ago
1.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
We're Hiring: Python Developer!

We are looking for an experienced Python Developer to join our dynamic team in Pune, India. The ideal candidate will possess a strong background in software development and be proficient in writing efficient, reusable code. You will play a key role in designing and implementing scalable applications while collaborating with cross-functional teams.

Location: Pune, India
Work Mode: Hybrid
Role: Python Developer
Experience: 5+ years

What We're Looking For:
• Proven experience designing, building, and operating data-oriented solutions in a high-volume, transactional, global industry. Experience with advertising technology (AdTech) highly desired.
• Proven experience developing simple, scalable, reliable architectures; building and operating concurrent, distributed systems; and solving difficult and novel problems.
• Proven experience developing data structures and algorithms; experience working with or supporting ML/AI solutions highly desirable.
• Proven experience and a passion for developing and operating data-oriented and/or full-stack solutions using Python, JavaScript/TypeScript, Airflow/Composer, Node, Kafka, Snowflake, BigQuery, and a mix of data platforms such as Spark, Hadoop, AWS Athena, Postgres, and Redis (a minimal Airflow sketch follows this listing).
• Excellent SQL development, query optimization, and data pipeline development skills required.
• Strong experience using public cloud platforms, including AWS and GCP, is required; experience with Docker and Kubernetes strongly preferred.
• Proven experience in modern software development and testing practices, with a willingness to share, partner, support, and coach other engineers, product people, and operations. Experience employing TDD, BDD, or ATDD highly desirable.
• Proven experience contributing to the development of principles, practices, and tooling supporting agile, testing/QA, DevSecOps, automation, and SRE. Experience in Trunk Based Development, XP, and implementing CI/CD highly desirable.
• Experience in SaaS product engineering and operations highly desirable.
• A focus on continuous learning and improving, both technically and professionally, in your industry, for you and your teams.
• Demonstrated resilience, with experience working in ambiguous situations.

What You'll Do:
• Develop software as a member of one of our engineering teams, participating in all stages of development, delivery, and operations, together with your tech lead, colleagues, and Product, Data Science, and Design leaders.
• Develop solutions that are simple, scalable, reliable, secure, maintainable, and make a measurable impact.
• Develop and deliver new features, maintain our product, and drive growth to hit team KPIs.
• Employ modern pragmatic engineering principles, practices, and tooling, including TDD/BDD/ATDD, XP, QA Engineering, Trunk Based Development, Continuous Delivery, automation, DevSecOps, and Site Reliability Engineering.
• Contribute to driving ongoing improvements to our engineering principles, practices, and tooling.
• Provide support and mentorship to junior engineers, prioritising continuous learning and development.
• Develop and maintain a contemporary understanding of AdTech developments, industry standards, partner and competitor platform developments, and commercial models from an engineering perspective. Combine these insights with technical expertise to contribute to our strategy and plans, influence product design, shape our roadmap, and help plan delivery.
Ready to take your career to the next level? Apply now and join us on this exciting journey!
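Airflow/Composer appears in the stack this listing names; as a rough illustration, here is a minimal, hypothetical Airflow DAG scheduling a daily extract-and-load pair of tasks. The DAG id, schedule, and task bodies are illustrative assumptions only.

```python
# Hypothetical Airflow sketch: a daily DAG with one extract task and one
# load task. DAG id, schedule, and task bodies are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # In a real pipeline this would pull from Kafka, an API, or a database.
    print("extracting source data...")

def load(**context):
    # In a real pipeline this would write to Snowflake or BigQuery.
    print("loading into the warehouse...")

with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run load only after extract succeeds
```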
Posted 3 weeks ago
Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
India's major tech hubs are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies by experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum

A typical career path in Snowflake may include roles such as:
- Junior Snowflake Developer
- Snowflake Developer
- Senior Snowflake Developer
- Snowflake Architect
- Snowflake Consultant
- Snowflake Administrator

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
- SQL
- Data warehousing concepts
- ETL tools
- Cloud platforms (AWS, Azure, GCP)
- Database management
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!