3.0 - 7.0 years
12 - 17 Lacs
New Delhi, Chennai, Bengaluru
Hybrid
Your day at NTT DATA
We are seeking an experienced Data Architect to join our team in designing and delivering innovative data solutions to clients. The successful candidate will be responsible for architecting, developing, and implementing data management solutions and data architectures for various industries. This role requires strong technical expertise, excellent problem-solving skills, and the ability to work effectively with clients and internal teams to design and deploy scalable, secure, and efficient data solutions.

What you'll be doing
Experience and Leadership:
- Proven experience in data architecture, with a recent role as a Lead Data Solutions Architect or a similar senior position in the field.
- Proven experience in leading architectural design and strategy for complex data solutions and overseeing their delivery.
- Experience in consulting roles, delivering custom data architecture solutions across various industries.

Architectural Expertise:
- Strong expertise in designing and overseeing delivery of data streaming and event-driven architectures, with a focus on Kafka and Confluent platforms.
- In-depth knowledge of architecting and implementing data lakes and lakehouse platforms, including experience with Databricks and Unity Catalog.
- Proficiency in conceptualising and applying Data Mesh and Data Fabric architectural patterns.
- Experience in developing data product strategies, with a strong inclination towards a product-led approach to data solution architecture.
- Extensive familiarity with cloud data architecture on platforms such as AWS, Azure, GCP, and Snowflake.
- Understanding of cloud platform infrastructure and its impact on data architecture.

Data Technology Skills:
- A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems.
- Knowledge of programming languages such as Python or R is beneficial.
- Exposure to ETL/ELT processes, SQL, and NoSQL databases is a nice-to-have, providing a well-rounded background.
- Experience with data visualization tools and DevOps principles/tools is advantageous.
- Familiarity with machine learning and AI concepts, particularly how they integrate into data architectures.

Design and Lifecycle Management:
- Proven background in designing modern, scalable, and robust data architectures.
- Comprehensive grasp of the data architecture lifecycle, from concept to deployment and consumption.

Data Management and Governance:
- Strong knowledge of data management principles and best practices, including data governance frameworks.
- Experience with data security and compliance regulations (GDPR, CCPA, HIPAA, etc.).

Leadership and Communication:
- Exceptional leadership skills to manage and guide a team of architects and technical experts.
- Excellent communication and interpersonal skills, with a proven ability to influence architectural decisions with clients and guide best practices.

Project and Stakeholder Management:
- Experience with agile methodologies (e.g. SAFe, Scrum, Kanban) in the context of architectural projects.
- Ability to manage project budgets, timelines, and resources, maintaining focus on architectural deliverables.

Location: Delhi or Bangalore
Workplace type: Hybrid Working
Posted 1 week ago
9.0 - 14.0 years
10 - 20 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
JD: Snowflake Implementer
Designing, implementing, and managing Snowflake data warehouse solutions, ensuring data integrity, and optimizing performance for clients or internal teams.
- Strong SQL skills: expertise in writing, optimizing, and troubleshooting SQL queries.
- Experience with data warehousing: understanding of data warehousing concepts, principles, and best practices.
- Knowledge of ETL/ELT technologies: experience with tools and techniques for data extraction, transformation, and loading.
- Experience with data modeling: ability to design and implement data models that meet business requirements.
- Familiarity with cloud platforms: experience with AWS, Azure, or GCP (depending on the specific Snowflake environment).
- Problem-solving and analytical skills: ability to identify, diagnose, and resolve technical issues.
- Communication and collaboration skills: ability to work effectively with cross-functional teams.
- Experience with Snowflake (preferred): prior experience with Snowflake is highly desirable.
- Certifications (preferred): Snowflake certifications (e.g., Snowflake Data Engineer, Snowflake Database Administrator) can be a plus.
Posted 1 week ago
6.0 - 10.0 years
7 - 14 Lacs
Bengaluru
Hybrid
Roles and Responsibilities
- Architect and build an effective data framework enabling end-to-end data solutions.
- Understand business needs, use cases, and drivers for insights, and translate them into detailed technical specifications.
- Create epics, features, and user stories with clear acceptance criteria for execution and delivery by the data engineering team.
- Create scalable and robust data solution designs that incorporate governance, security, and compliance aspects.
- Develop and maintain logical and physical data models, and work closely with data engineers, data analysts, and data testers for their successful implementation.
- Analyze, assess, and design data integration strategies across various sources and platforms.
- Create project plans and timelines while monitoring and mitigating risks and controlling the progress of the project.
- Conduct daily scrums with the team, with a clear focus on meeting sprint goals and timely resolution of impediments.
- Act as a liaison between technical teams and business stakeholders.
- Guide and mentor the team on best practices for data solutions and delivery frameworks.
- Actively work with, facilitate, and support the stakeholders/clients to complete User Acceptance Testing, and ensure strong adoption of the data products after launch.
- Define and measure KPIs/KRAs for features and ensure the data roadmap is verified through measurable outcomes.

Prerequisites
- 5 to 8 years of professional, hands-on experience building end-to-end data solutions on cloud-based data platforms, including 2+ years in a Data Architect role.
- Proven hands-on experience in building pipelines for data lakes, data lakehouses, data warehouses, and data visualization solutions.
- Sound understanding of modern data technologies like Databricks and Snowflake, and patterns like Data Mesh and Data Fabric.
- Experience managing the data life cycle in a fast-paced Agile/Scrum environment.
- Excellent spoken and written communication, receptive listening skills, and the ability to convey complex ideas clearly and concisely to technical and non-technical audiences.
- Ability to collaborate and work effectively with cross-functional teams, project stakeholders, and end users to produce quality deliverables within stipulated timelines.
- Ability to manage, coach, and mentor a team of data engineers, data testers, and data analysts.
- Strong process driver with expertise in the Agile/Scrum framework on tools like Azure DevOps, Jira, or Confluence.
- Exposure to machine learning, Gen AI, and modern AI-based solutions.

Experience
Technical Lead - Data Analytics with 6+ years of overall experience, of which 2+ years is in data architecture.

Education
Engineering degree from a Tier 1 institute preferred.

Compensation
The compensation structure will be as per industry standards.
Posted 1 week ago
6.0 - 10.0 years
3 - 8 Lacs
Noida
Work from Office
Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/Bangalore
Education: B.E./B.Tech./MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good to have Skills: Snowpark, Data Build Tool, Finance Domain

Preferred Skills
- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development.
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe).
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging; a sketch of these controls follows this list.
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.
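A minimal sketch of the RBAC and masking-policy responsibilities above, run through snowflake-connector-python. The role, policy, table, and connection names are illustrative assumptions, not from the posting.

```python
# Sketch: enforce a column masking policy and role-based grants in Snowflake.
# ANALYST, pii_mask, and customers are invented names for illustration.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # placeholder credentials
    user="my_user",
    password="my_password",
    warehouse="ADMIN_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

statements = [
    # Column-level security: mask email for every role except ANALYST.
    """CREATE MASKING POLICY IF NOT EXISTS pii_mask AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('ANALYST') THEN val
            ELSE '*** MASKED ***' END""",
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY pii_mask",
    # RBAC: grant read access to the ANALYST role only.
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST",
    "GRANT USAGE ON SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST",
    "GRANT SELECT ON TABLE customers TO ROLE ANALYST",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```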
Posted 1 week ago
5.0 - 9.0 years
7 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Hiring for Snowflake Developer with experience range 2 years & above.
Mandatory Skills: Snowflake Developer, Snowflake, SnowPro
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Interview Mode: F2F
Posted 1 week ago
4.0 - 9.0 years
10 - 20 Lacs
Chandigarh
Hybrid
- Design, build, and maintain scalable and reliable data pipelines on Databricks, Snowflake, or equivalent cloud platforms.
- Ingest and process structured, semi-structured, and unstructured data from a variety of sources including APIs, RDBMS, and file systems.
- Perform data wrangling, cleansing, transformation, and enrichment using PySpark, Pandas, NumPy, or similar libraries (see the sketch after this list).
- Optimize and manage large-scale data workflows for performance, scalability, and cost-efficiency.
- Write and optimize complex SQL queries for transformation, extraction, and reporting.
- Design and implement efficient data models and database schemas with appropriate partitioning and indexing strategies for a Data Warehouse or Data Mart.
- Leverage cloud services (e.g., AWS S3, Glue, Kinesis, Lambda) for storage, processing, and orchestration.
- Use orchestration tools like Airflow, Temporal, or AWS Step Functions to manage end-to-end workflows.
- Build containerized solutions using Docker and manage deployment pipelines via CI/CD tools such as Azure DevOps, GitHub Actions, or Jenkins.
- Collaborate closely with data scientists, analysts, and business stakeholders to understand requirements and deliver data solutions.
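A minimal PySpark sketch of the wrangling/cleansing step described above. The paths, column names, and cleansing rules are assumptions for illustration.

```python
# Sketch: de-duplicate, type-normalize, and enrich raw order records,
# then write them partitioned for cheap downstream scans.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_cleansing").getOrCreate()

raw = spark.read.json("s3://my-bucket/raw/orders/")  # placeholder source

clean = (
    raw.dropDuplicates(["order_id"])                       # de-duplicate on key
       .filter(F.col("order_id").isNotNull())              # drop broken records
       .withColumn("order_ts", F.to_timestamp("order_ts")) # normalize types
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))    # derive partition key
       .fillna({"currency": "USD"})                        # enrich with defaults
)

clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://my-bucket/curated/orders/"  # placeholder sink
)
```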
Posted 1 week ago
1.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role: SDET - Rest Assured
- Experience with Selenium for automating web applications using Java.
- Experience testing APIs using API testing tools such as Postman.
- Experience with Rest Assured and automating REST web services.
- Experience with the Cucumber BDD framework for creating test cases.
- Experience with GitHub, Maven, TestNG, and CI/CD integration tools like Bamboo.
- Experience writing SQL queries and knowledge of accessing databases like Snowflake and MongoDB.
- Programming skills in Java.
- Experience in Agile Scrum methodology.
- Experience with bug and backlog management.
- Experience with Test Case Management systems.
- Experience with surfacing and executing on continuous improvement initiatives within quality teams.
- Experience with ExtentReports, Apache POI, OpenCSV, MSSQL-JDBC, Bitbucket, JavaMail, and/or Eclipse/IntelliJ is a bonus.
- Experience with Jira is a bonus.
- Experience with pricing and/or the wholesale food distribution industry is a bonus.
- Experience with quality-focused metrics and auditing systems of test is a bonus.

Performance Parameters and Measures:
1. Understanding the test requirements and test case design of the product: ensure error-free testing solutions, minimum process exceptions, 100% SLA compliance, number of automations done using VB/macros.
2. Execute test cases and reporting: testing efficiency and quality, on-time delivery, troubleshooting queries within TAT, CSAT score.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
6.0 - 10.0 years
8 - 12 Lacs
Gurugram
Work from Office
About The Role
Role Purpose: Data Analyst - Data Modeling, Data Pipelines, ETL Processes, Tableau, SQL, Snowflake.
Do
- Strong expertise in data modeling, data warehousing, and ETL processes.
- Proficient in SQL and experienced with data warehousing tools (e.g., Snowflake, Redshift, BigQuery) and ETL tools (e.g., Talend, Informatica, SSIS).
- Demonstrated ability to lead and manage complex projects involving cross-functional teams.
- Excellent analytical, problem-solving, and organizational skills.
- Strong communication and leadership abilities, with a track record of mentoring and developing team members.
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Preference for candidates with experience in ETL using Python, Airflow, or dbt (see the orchestration sketch after this section).
- Build capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Undertake product trainings to stay current with product features, changes, and updates.
- Enroll in product-specific and other trainings per client requirements/recommendations.
- Partner with team leaders to brainstorm and identify training themes and learning issues to better serve the client.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver
Performance Parameters and Measures:
1. Process: number of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback.
2. Self-Management: productivity, efficiency, absenteeism, training hours, number of technical trainings completed.
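A minimal sketch of a Python/Airflow ETL DAG of the kind the posting prefers. The DAG id, schedule, and the extract/load helpers are assumptions for illustration.

```python
# Sketch: a two-task extract -> load DAG using Airflow 2.x operators.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull rows from a source system.
    print("extracting source rows")


def load(**context):
    # Placeholder: write transformed rows to the warehouse (e.g., Snowflake).
    print("loading into warehouse")


with DAG(
    dag_id="daily_sales_etl",       # invented DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # simple linear dependency
```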
Posted 1 week ago
5.0 - 8.0 years
3 - 7 Lacs
Kolkata
Work from Office
About The Role
Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at the client end, and to ensure they meet 100% of quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes.
- Investigate problem areas throughout the software development life cycle.
- Facilitate root cause analysis of system issues and the problem statement.
- Identify ideas to improve system performance and availability.
- Analyze client requirements and convert them into feasible designs.
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
- Confer with project managers to obtain information on software capabilities.
2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications of existing ones.
- Ensure the code is error-free, with no bugs or test failures.
- Prepare reports on programming project specifications, activities, and status.
- Ensure all issues are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns.
- Compile timely, comprehensive, and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders.
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work.
- Take feedback regularly to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Document all necessary details and reports formally for proper understanding of the software, from client proposal to implementation.
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Deliver
Performance Parameters and Measures:
1. Continuous integration, deployment & monitoring of software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan.
2. Quality & CSAT: on-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation.
3. MIS & reporting: 100% on-time MIS and report generation.

Mandatory Skills: Snowflake.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
5.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Starburst Data Engineer/Architect
- Expertise in Starburst and policy management tools like Ranger or equivalent.
- In-depth knowledge of data modelling principles and techniques, including relational and dimensional.
- Excellent problem-solving skills and the ability to troubleshoot and debug complex data-related issues.
- Strong awareness of data tools and platforms like Starburst, Snowflake, and Databricks, and programming languages like SQL.
- In-depth knowledge of data management principles, methodologies, and best practices, with excellent analytical, problem-solving, and decision-making skills.
- Develop, implement, and maintain database systems using SQL.
- Write complex SQL queries for integration with applications.
- Develop and maintain data models (conceptual, physical, and logical) to meet organisational needs.

Do
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
b. Develop record management processes and policies.
c. Build and maintain relationships at all levels within the client base and understand their requirements.
d. Provide sales data, proposals, data insights, and account reviews to the client base.
e. Identify areas to increase efficiency and automation of processes.
f. Set up and maintain automated data processes.
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing.
h. Produce and track key performance indicators.
2. Analyze the data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content.
b. Design and carry out surveys and analyze survey data per customer requirements.
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools.
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking.
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool.
f. Develop predictive models and share insights with clients per their requirements.

Deliver
Performance Parameters and Measures:
1. Analyze data sets and provide relevant information to the client: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.

Mandatory Skills: Starburst.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
3.0 - 20.0 years
10 - 40 Lacs
Pune, Delhi / NCR, Greater Noida
Work from Office
Mandatory Skills - Snowflake, Matillion
Posted 1 week ago
2.0 - 7.0 years
6 - 16 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hiring for Snowflake Developer with experience range 2 years & above.
Mandatory Skills: Snowflake Developer, Snowflake, SnowPro
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Interview Mode: F2F
Posted 1 week ago
10.0 - 13.0 years
25 - 37 Lacs
Gurugram
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

REQUIREMENTS:
- Total experience 9+ years.
- Hands-on experience in Big Data Engineering.
- Strong expertise in Apache Spark and PySpark/Python.
- Deep technical knowledge of AWS Glue (Crawler, Data Catalog); a minimal Glue job sketch follows this posting.
- Hands-on working experience in Python.
- Strong working experience with AWS services, including S3, Lambda, SNS, Secrets Manager, and Athena.
- Proven experience with Infrastructure as Code using CloudFormation and Terraform.
- Solid experience in Snowflake.
- Proficiency in setting up and maintaining CI/CD pipelines with GitHub Actions.
- Familiarity with tools like Jira and GitHub.
- Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.

RESPONSIBILITIES:
- Understand the client's business use cases and technical requirements, and convert them into a technical design that elegantly meets the requirements.
- Map decisions to requirements and translate them for developers.
- Identify different solutions and narrow down the best option that meets the client's requirements.
- Define guidelines and benchmarks for NFR considerations during project implementation.
- Write and review design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Review architecture and design on aspects like extensibility, scalability, security, design patterns, user experience, and NFRs, and ensure that all relevant best practices are followed.
- Develop and design the overall solution for defined functional and non-functional requirements, and define the technologies, patterns, and frameworks to materialize it.
- Understand and relate technology integration scenarios and apply these learnings in projects.
- Resolve issues raised during code review through exhaustive, systematic analysis of the root cause, and justify the decisions taken.
- Carry out POCs to make sure the suggested design/technologies meet the requirements.
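A minimal AWS Glue job sketch (PySpark) matching the Glue/Data Catalog skills above. The database, table, and bucket names are illustrative assumptions.

```python
# Sketch: read a Data Catalog table (e.g., registered by a Crawler),
# lightly transform it with Spark, and write curated Parquet to S3.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"  # invented catalog names
)

# Transform via Spark, then write curated output.
df = orders.toDF().dropDuplicates(["order_id"])
df.write.mode("overwrite").parquet("s3://my-bucket/curated/orders/")

job.commit()
```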
Posted 1 week ago
3.0 - 8.0 years
6 - 16 Lacs
Pune, Chennai, Bengaluru
Hybrid
Hiring for Snowflake Developer with experience range 2 years & above.
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: PAN India
Posted 1 week ago
8.0 - 13.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Role & responsibilities
Tech stack: Databricks, Python, PySpark, Snowflake, SQL.
Minimum requirements for the candidate:
- Advanced SQL: queries, scripts, stored procedures, materialized views, and views.
- Focus on ELT: load data into the database and perform transformations in-database (see the sketch after this posting).
- Ability to use analytical SQL functions.
- Snowflake experience.
- Cloud Data Warehouse solutions experience (Snowflake, Azure DW, or Redshift); data modeling, analysis, programming.
- Experience with DevOps models utilizing a CI/CD tool.
- Work in a hands-on cloud environment on the Azure Cloud Platform (ADLS, Blob).
- Airflow.

Requirements:
- Good interpersonal skills; comfort and competence in dealing with different teams within the organization.
- Ability to interface with multiple constituent groups and build sustainable relationships.
- Strong and effective communication skills (verbal and written).
- Strong analytical and problem-solving skills.
- Experience working in a matrix organization.
- Proactive problem solver; ability to prioritize and deliver.
- Results-oriented, flexible, adaptable.
- Works well independently and can lead a team.
- Versatile, creative temperament; ability to think out of the box while defining sound and practical solutions.
- Ability to master new skills.
- Familiar with Agile practices and methodologies.
- Professional data engineering experience focused on batch and real-time data pipelines using Spark, Python, and SQL.
- Data warehouse experience (data modeling, programming).
- Experience working with Snowflake.
- Experience working in a cloud environment, preferably Microsoft Azure.
- Cloud Data Warehouse solutions (Snowflake, Azure DW).
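A minimal sketch of the ELT pattern named above: load raw rows first, then transform in-database with an analytical SQL function. Table names, stage names, and connection details are illustrative assumptions; uses snowflake-connector-python.

```python
# Sketch: load-then-transform (ELT) in Snowflake from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

elt_steps = [
    # 1. Load: staged files land in the raw table as-is.
    """COPY INTO raw_orders FROM @orders_stage
       FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)""",
    # 2. Transform in-database: keep the latest record per order using an
    #    analytical (window) function, then publish to the curated table.
    """INSERT OVERWRITE INTO curated_orders
       SELECT order_id, customer_id, amount, updated_at
       FROM raw_orders
       QUALIFY ROW_NUMBER() OVER (
           PARTITION BY order_id ORDER BY updated_at DESC) = 1""",
]

cur = conn.cursor()
try:
    for step in elt_steps:
        cur.execute(step)
finally:
    cur.close()
    conn.close()
```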
Posted 1 week ago
3.0 - 8.0 years
6 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Hiring for Snowflake Developer with experience range 2 years & above.
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: PAN India
Posted 1 week ago
8.0 - 12.0 years
15 - 30 Lacs
Hyderabad
Hybrid
Job Title: Data Architect / Data Modeler
Experience Level: 8+ Years
Location: Hyderabad

Job Summary
We are seeking a highly experienced Data Architect to join our growing Data & Analytics team. This role demands a strategic thinker and technical expert who can design and build robust, scalable, and efficient data solutions. You will play a critical role in architecting end-to-end data pipelines, designing optimized data models, and delivering business-centric data infrastructure using cutting-edge technologies such as Python, PySpark, SQL, Snowflake, and/or Databricks. The ideal candidate will have a deep understanding of data engineering best practices and a proven track record of enabling data-driven decision-making through innovative and scalable data solutions.

Key Responsibilities
- Architect & design scalable data pipelines: lead the design and implementation of high-performance, scalable, and maintainable data pipelines that support batch and real-time processing.
- Data modeling & data architecture: design and implement optimized data models and database schemas to support analytics, reporting, and machine learning use cases.
- Cloud data platforms: develop and manage modern cloud-based data architectures using platforms like Snowflake or Databricks, ensuring performance, security, and cost-efficiency.
- Data integration & ETL development: build robust ETL/ELT workflows to ingest, transform, and provision data from a variety of internal and external sources.
- Collaboration with stakeholders: work closely with data analysts, data scientists, product managers, and business leaders to translate business requirements into technical specifications and data solutions.
- Data quality & governance: implement and advocate for best practices in data quality, security, compliance, lineage, and governance.
- Performance optimization: optimize data storage and query performance using advanced SQL, partitioning, indexing, caching strategies, and compute resource tuning.
- Mentorship & best practices: provide mentorship to junior engineers, establish coding standards, and contribute to the growth and maturity of the data engineering practice.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
- 8+ years of experience in data engineering or related roles.
- Strong expertise in Python and PySpark for data processing and transformation.
- Proficiency in advanced SQL, with a deep understanding of query optimization and performance tuning.
- Hands-on experience with Snowflake and/or Databricks in a production environment.
- Experience designing and implementing data warehouses and data lakes.
- Solid understanding of distributed computing frameworks, big data ecosystems, and modern data architecture patterns.
- Experience with CI/CD, version control systems (e.g., Git), and workflow orchestration tools (e.g., Airflow, dbt, etc.).
- Strong communication skills, with the ability to clearly articulate technical concepts to non-technical stakeholders.
Posted 1 week ago
1.0 - 3.0 years
2 - 5 Lacs
Chennai
Work from Office
- Create test case documents/plans for testing the data pipelines.
- Check the mappings for the fields that support data staging and data marts, and the data type constraints of the fields present in Snowflake.
- Verify non-null fields are populated.
- Verify business requirements and confirm the correct logic is implemented in the transformation layer of the ETL process.
- Verify stored procedure calculations and data mappings.
- Verify data transformations are correct based on the business rules.
- Verify successful execution of data loading workflows.
A sketch of automated checks of this kind follows this list.
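A minimal sketch of the checks listed above, written as pytest tests against Snowflake. The table and column names, business rule, and connection helper are assumptions, not from the posting.

```python
# Sketch: automated data-quality tests for an ETL pipeline (run with pytest).
import snowflake.connector


def get_connection():
    return snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="QA_WH", database="ANALYTICS", schema="MART",
    )


def scalar(sql):
    # Run a query and return the single value it produces.
    with get_connection() as conn:
        return conn.cursor().execute(sql).fetchone()[0]


def test_non_null_fields_populated():
    # Non-null verification: key fields must never be NULL after load.
    assert scalar(
        "SELECT COUNT(*) FROM dim_customer WHERE customer_id IS NULL") == 0


def test_transformation_logic():
    # Business-rule check (invented rule): net = gross minus discount.
    assert scalar(
        """SELECT COUNT(*) FROM fact_sales
           WHERE net_amount <> gross_amount - discount_amount""") == 0


def test_load_completed():
    # Workflow check: today's load wrote at least one row.
    assert scalar(
        "SELECT COUNT(*) FROM fact_sales WHERE load_date = CURRENT_DATE") > 0
```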
Posted 1 week ago
1.0 - 4.0 years
4 - 7 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Locations: Pune, Bangalore, Hyderabad, Indore
Contract duration: 6 months

Responsibilities
- Must have experience working as a Snowflake Admin/Developer in Data Warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse, and end-to-end data warehouse implementations on-premise, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in Snowflake advanced concepts: setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and understanding how to use these features (see the sketch after this posting).
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data model techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security and data access controls and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and databases.
- Job conductor, scheduler, and monitoring.
- GIT repository: creating users & roles and providing access to them.
- Agile methodology and 24/7 admin and platform support.
- Estimation of effort based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.
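A minimal sketch of the Snowflake admin features named above (resource monitors, RBAC, zero-copy clone, time travel) run from Python. All object names and the 100-credit quota are illustrative assumptions.

```python
# Sketch: common Snowflake administration statements executed in sequence.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="my_password",
    role="ACCOUNTADMIN",  # admin statements need a privileged role
)

admin_statements = [
    # Resource monitor: cap spend and suspend the warehouse at the limit.
    """CREATE OR REPLACE RESOURCE MONITOR monthly_cap
       WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_cap",
    # RBAC: a read-only role scoped to one schema.
    "CREATE ROLE IF NOT EXISTS reporting_ro",
    "GRANT USAGE ON DATABASE analytics TO ROLE reporting_ro",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.mart TO ROLE reporting_ro",
    # Zero-copy clone: instant, storage-free copy for testing.
    "CREATE TABLE IF NOT EXISTS analytics.mart.orders_dev CLONE analytics.mart.orders",
    # Time travel: query the table as it looked one hour ago.
    "SELECT COUNT(*) FROM analytics.mart.orders AT(OFFSET => -3600)",
]

cur = conn.cursor()
try:
    for stmt in admin_statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```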
Posted 1 week ago
3.0 - 8.0 years
4 - 8 Lacs
Mumbai, Hyderabad, Bengaluru
Work from Office
We are looking for a skilled AI Engineer with 3 to 8 years of experience in software engineering or machine learning to design, implement, and productionize LLM-powered agents that solve real-world enterprise problems. This position is based in Kolkata.

Roles and Responsibility
- Architect and build multi-agent systems using frameworks such as LangChain, LangGraph, AutoGen, Google ADK, Palantir Foundry, or custom orchestration layers (a minimal agent-loop sketch follows this posting).
- Fine-tune and prompt-engineer LLMs (OpenAI, Anthropic, open-source) for retrieval-augmented generation (RAG), reasoning, and tool use.
- Integrate agents with enterprise data sources (APIs, SQL/NoSQL DBs, vector stores like Pinecone, Elasticsearch) and downstream applications (Snowflake, ServiceNow, custom APIs).
- Own the MLOps lifecycle: containerize (Docker), automate CI/CD, monitor drift & hallucinations, and set up guardrails, observability, and rollback strategies.
- Collaborate cross-functionally with product, UX, and customer teams to translate requirements into robust agent capabilities and user-facing features.
- Benchmark and iterate on latency, cost, and accuracy; design experiments, run A/B tests, and present findings to stakeholders.

Job Requirements
- Strong Python skills (async I/O, typing, testing); familiarity with TypeScript/Node or Go is a bonus.
- Hands-on experience with at least one LLM/agent framework and platform (LangChain, LangGraph, Google ADK, LlamaIndex, Emma, etc.).
- Solid grasp of vector databases (Pinecone, Weaviate, FAISS) and embedding models.
- Experience building and securing REST/GraphQL APIs and microservices.
- Cloud skills on AWS, Azure, or GCP (serverless, IAM, networking, cost optimization).
- Proficiency with Git, Docker, and CI/CD (GitHub Actions, GitLab CI, or similar).
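A minimal sketch of the tool-using agent loop this role builds, in plain Python rather than any specific framework. call_llm() is a hypothetical stub standing in for a chat-completion client (OpenAI, Anthropic, etc.), and the tool is invented for illustration.

```python
# Sketch: bounded agent loop that lets a model request a tool, feeds the
# result back, and stops when the model produces a final answer.
import json


def call_llm(messages):
    # Hypothetical stub for a chat-completion API call. First turn: request
    # a tool; once a tool result is present in the history, answer.
    if any(m["role"] == "tool" for m in messages):
        return json.dumps({"answer": "Order A-42 has shipped."})
    return json.dumps({"tool": "lookup_order", "args": {"order_id": "A-42"}})


def lookup_order(order_id):
    # Placeholder tool: a real agent would call an enterprise API or DB.
    return {"order_id": order_id, "status": "shipped"}


TOOLS = {"lookup_order": lookup_order}


def run_agent(user_query, max_steps=5):
    messages = [{"role": "user", "content": user_query}]
    for _ in range(max_steps):           # bounded loop guards against runaways
        reply = json.loads(call_llm(messages))
        if "answer" in reply:            # model produced a final answer
            return reply["answer"]
        tool_fn = TOOLS[reply["tool"]]   # dispatch the requested tool
        result = tool_fn(**reply["args"])
        messages.append({"role": "tool", "content": json.dumps(result)})
    return None


print(run_agent("Where is order A-42?"))
```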
Posted 1 week ago
7.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Bengaluru
Hybrid
Key Responsibilities:
- Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines.
- Design and implement pipelines adhering to 2NF/3NF normalization standards.
- Develop and maintain ETL processes for integrating data from multiple ERP and source systems.
- Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs.
- Raise CAB requests via Carrier's change process and manage production deployments.
- Provide UAT support and ensure smooth transition of finalized pipelines to support teams.
- Create and maintain comprehensive technical documentation for traceability and handover.
- Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration.
- Optimize complex SQL queries, perform performance tuning, and ensure DataOps best practices.

Requirements:
- Strong hands-on experience with Snowflake.
- Expert-level SQL skills and deep understanding of data transformation.
- Solid grasp of data architecture and 2NF/3NF normalization techniques.
- Experience with cloud-based data platforms and modern data pipeline design.
- Exposure to AWS data services like S3, Glue, Lambda, and Step Functions (preferred).
- Proficiency with ETL tools and working in Agile environments.
- Familiarity with the Carrier CAB process or similar structured deployment frameworks.
- Proven ability to debug complex pipeline issues and enhance pipeline scalability.
- Strong communication and collaboration skills.
Posted 1 week ago
8.0 - 13.0 years
15 - 30 Lacs
Hyderabad
Hybrid
Job Description:
- Advanced SQL: queries, scripts, stored procedures, materialized views, and views.
- Focus on ELT: load data into the database and perform transformations in-database.
- Ability to use analytical SQL functions.
- Snowflake experience.
- Cloud Data Warehouse solutions experience (Snowflake, Azure DW, or Redshift); data modeling, analysis, programming.
- Experience with DevOps models utilizing a CI/CD tool.
- Work in a hands-on cloud environment on the Azure Cloud Platform (ADLS, Blob).
- Airflow.

Preferred candidate profile:
- Good interpersonal skills; comfort and competence in dealing with different teams within the organization.
- Ability to interface with multiple constituent groups and build sustainable relationships.
- Strong and effective communication skills (verbal and written).
- Strong analytical and problem-solving skills.
- Experience working in a matrix organization.
- Proactive problem solver; ability to prioritize and deliver.
- Results-oriented, flexible, adaptable.
- Works well independently and can lead a team.
- Versatile, creative temperament; ability to think out of the box while defining sound and practical solutions.
- Ability to master new skills.
- Familiar with Agile practices and methodologies.
- Professional data engineering experience focused on batch and real-time data pipelines using Spark, Python, and SQL.
- Data warehouse experience (data modeling, programming).
- Experience working with Snowflake.
- Experience working in a cloud environment, preferably Microsoft Azure.
- Cloud Data Warehouse solutions (Snowflake, Azure DW).
Posted 1 week ago
3.0 - 6.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Skills: Microsoft Azure, Hadoop, Spark, Databricks, Airflow, Kafka, PySpark

Requirements:
- Experience working with distributed technology tools to develop batch and streaming pipelines using SQL, Spark, Python, Airflow, Scala, and Kafka (see the streaming sketch after this posting).
- Experience in cloud computing, e.g., AWS, GCP, Azure, etc.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Strong skills in building positive relationships across Product and Engineering.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.
- Working knowledge of data warehousing, data modelling, governance, and data architecture.
- Experience working with data platforms, including EMR, Airflow, and Databricks (Data Engineering & Delta Lake components).
- Experience working in an Agile and Scrum development process.
- Experience in EMR/EC2, Databricks, etc.
- Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.
- Experience architecting data products on streaming, serverless, and microservices architectures and platforms.
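A minimal sketch of a streaming pipeline like the one described above: PySpark Structured Streaming reading from Kafka. The broker, topic, schema, and output paths are illustrative assumptions, and the job needs the spark-sql-kafka package on the classpath.

```python
# Sketch: consume JSON events from Kafka and sink them to partitioned files.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("events_stream").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
    # Kafka delivers bytes; decode and parse the JSON payload.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3://my-bucket/streams/events/")
    .option("checkpointLocation", "s3://my-bucket/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```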
Posted 1 week ago
7.0 - 8.0 years
4 - 6 Lacs
Mumbai, Hyderabad, Chennai
Work from Office
Data Engineer (Contract | 6 Months)
We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

Key Responsibilities
- Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory.
- Monitor and support production ETL jobs.
- Develop and maintain data lineage documentation for all systems.
- Design data mapping and documentation to aid QA/UAT testing.
- Evaluate and recommend modern data integration tools.
- Optimize shared data workflows and batch schedules.
- Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows.
- Participate in performance tuning and improvement recommendations.
- Support BI/MDM initiatives including Data Vault and Data Lakes.

Required Skills
- 7+ years of experience in data engineering roles.
- Strong command of SQL, with 5+ years of hands-on development.
- Deep experience with Snowflake, Azure Data Factory, and dbt.
- Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.).
- Bachelor's in CS, Engineering, Math, or a related field.
- Experience in the healthcare domain (working with PHI/PII data).
- Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments).
- Excellent communication and documentation skills.
- Experience with BI tools like Power BI, Cognos, etc.
- Organized self-starter with strong time-management and critical thinking abilities.

Nice To Have
- Experience with Data Lakes and Data Vaults.
- QA & UAT alignment with clear development documentation.
- Multi-cloud experience (especially Azure, AWS).
Posted 1 week ago
3.0 - 7.0 years
8 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
About Us
At SentinelOne, we're redefining cybersecurity by pushing the limits of what's possible, leveraging AI-powered, data-driven innovation to stay ahead of tomorrow's threats. From building industry-leading products to cultivating an exceptional company culture, our core values guide everything we do. We're looking for passionate individuals who thrive in collaborative environments and are eager to drive impact. If you're excited about solving complex challenges in bold, innovative ways, we'd love to connect with you.

Who are we?
The Data team is tasked with providing a world-class data platform that enables unrivalled cost, performance, and scalability for SentinelOne and our customers. The exponential growth in volumes of data, users of data, and types of data calls for a new modern architecture that addresses the new data requirements for enterprise organizations. Help us get this platform into the hands of customers and support them in their mission to affordably collect and retain their most critical asset: data. SentinelOne is shaping the converged future of security and data through its unified data platform. This is a unique opportunity to operate in an emerging, startup-like environment within SentinelOne to build and scale our data business beyond just security use cases.

What are we looking for?
We are looking for a team member who puts the customer first and is passionate about solving problems with creativity, compassion, and technical acumen. You will need to bring a combination of technical, business, strategic, and problem-solving skills to the team to support pre-sales efforts and act as a data subject matter expert for the larger SentinelOne team. We are looking for an individual who is smart, passionate about data, and who brings a sense of joy and teamwork to everything they do. As a Sr. Solutions Engineer, you will illustrate SentinelOne's value to prospective customers. We need a self-starter who excels in a high-paced startup environment and thrives on pitching revolutionary technology to many areas of an organisation, including C-level executives, security engineers, IT operations, DevOps, and Engineering professionals. They should be willing to wear many hats, step up, and drive solutions to problems related to external and internal needs. This individual will be instrumental in accelerating our sales and strategic initiatives and growing SentinelOne.

What skills and knowledge should you bring?
- 5+ years of experience as a Solutions (Sales) Engineer or Architect.
- BS/BA degree or equivalent technical experience is desired, but we love a well-rounded candidate with a broad range of interests and talents.
- Strong background with big data platforms (Cassandra, Hadoop, etc.), data lakes (Snowflake, Databricks), streaming analytics (Kafka), log management (ElasticSearch, SumoLogic, etc.), or SIEM (Splunk, Devo, QRadar, Exabeam, etc.).
- Some code-writing proficiency is desired (C/C++, Shell, Perl, Python).
- Experience with RegEx and writing parsers.
- Background in cloud providers (AWS, Azure, Google) and technologies such as Kubernetes.
- Ability to demonstrate product value and use cases, both customer-specific and generic.
- Demonstrable experience in objection handling and positioning against competitive or alternative technologies, including how to transition to new data pipelines.
- Concise written and oral communication skills to effectively lead business and technical presentations, demonstrations, and conversations with both executives and technical audiences. Fluency in English is required.
- Demonstrable experience successfully selling to mid-to-large customers and working across an organisation to get technical buy-in and acceptance.
- Ability to drive the evaluation/POC through a defined process, providing timely consultation and building a strong relationship with the technical buyer or champion.
- Provide first-level technical support throughout the sales process, with involvement as it is transitioned to customer success.
- Availability to travel to visit prospects and customers (usually no more than 20-25%, and as required).

What will you do?
The principal responsibility of this position is to generate revenue from Strategic Accounts across the region by following up on multiple lead sources, developing new clients, and selling directly to customers while leveraging our channel community. In this position, you will:
- Run a sophisticated sales process from prospecting to closure.
- Partner with our channel team to drive both net-new and recurring revenue.
- Partner with channel managers to build pipeline and grow the assigned territory.
- Become an insider within the cyber security industry and an expert in SentinelOne products.
- Stay well educated and informed about SentinelOne's competitive landscape and how to sell the value of our solutions and services compared to the relevant competitors in the Next Generation Endpoint market space.
- Consistently meet or exceed sales quotas.

Why us?
You will be joining a cutting-edge company where you will tackle extraordinary challenges and work with the very best in the industry.
- Health Insurance
- Industry-leading gender-neutral parental leave
- Paid Company Holidays
- Paid Sick Time
- Employee stock purchase program
- Employee assistance program
- Gym membership reimbursement
- Wifi/cell phone reimbursement
- Numerous company-sponsored events, including regular happy hours and team-building events

SentinelOne is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. SentinelOne participates in the E-Verify Program for all U.S.-based roles.
Posted 1 week ago