2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 4 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake data migration concepts using external and internal stages; proficiency in SQL stored procedures; knowledge of creating tasks and Snowpipe concepts.
- Must have at least one of the following:
  a) Good knowledge of Python, the ability to build ML models, and familiarity with Snowpark libraries.
  b) Good knowledge of semiconductor and supply chain functional concepts.
- Strong understanding of SQL and database management.
- Experience in creating complex SQL stored procedures and skill in using SQL and its built-in functions.
- Good command of SQL performance tuning and optimization techniques.
- Familiarity with cloud computing concepts and services.
- The candidate should be flexible to work and attend meetings during US overlap timings.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
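For context on the stage, Snowpipe, and task concepts this posting names, here is a minimal sketch using the Snowflake Python connector. Every identifier (account, warehouse, bucket, table, and procedure names) is a placeholder assumption, not something from the posting; the SQL statements themselves are standard Snowflake DDL.

```python
# Hedged sketch: external stage -> Snowpipe -> scheduled task in Snowflake.
# All identifiers (account, warehouse, bucket, table names) are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# External stage over an S3 prefix (credentials omitted for brevity).
cur.execute("""
    CREATE STAGE IF NOT EXISTS raw_orders_stage
    URL = 's3://example-bucket/orders/'
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Snowpipe: auto-ingest COPY from the stage into a landing table.
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw_orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw_orders FROM @raw_orders_stage
""")

# Task: hourly transformation via a stored procedure.
cur.execute("""
    CREATE TASK IF NOT EXISTS transform_orders_task
    WAREHOUSE = ETL_WH
    SCHEDULE = '60 MINUTE'
    AS CALL transform_orders()
""")
cur.execute("ALTER TASK transform_orders_task RESUME")
conn.close()
```

The division of labor mirrors the posting: the stage and pipe handle ingestion, while the task wraps the stored-procedure transformation work.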
Posted 4 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Shell Scripting
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the established requirements. Your role will be pivotal in driving the development of innovative solutions that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good-to-have skills: experience with Shell Scripting.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud-based data solutions and architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 4 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Shell Scripting
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the established requirements. Your role will be pivotal in driving the development of innovative solutions that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good-to-have skills: experience with Shell Scripting.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud-based data solutions and architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 4 days ago
2.0 - 5.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Project Role: Operations Engineer
Project Role Description: Support the operations and/or manage delivery for production systems and services based on operational requirements and service agreements.
Must-have skills: Microsoft SQL Server Administration
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Operations Engineer, you will support the operations and manage delivery for production systems and services based on operational requirements and service agreements. Your typical day will involve monitoring system performance, troubleshooting issues, and collaborating with various teams to ensure seamless operations. You will also engage in planning and executing maintenance activities to enhance system reliability and efficiency, while ensuring compliance with established protocols and standards.

Technical skills:
- Good knowledge of almost all core databases: RDBMS (Oracle, MS SQL, DB2) and NoSQL (MongoDB, Cassandra, Snowflake, OpenSearch, DynamoDB, etc.).
- Very good knowledge of scripting languages (YAML, JSON, PowerShell) to implement future requirements in a minimal time span.
- Good understanding of different licensing models (Oracle, Microsoft).
- Good understanding of IAM tools for integration (Okta, Centrify, CyberArk).
- Design and define the architecture of new implementation, migration, and onboarding strategies in the database space.

Project management:
- Work closely with the business or the service owner for all new technologies from a functional standpoint.
- Work with the SRB team to ensure all technologies under the service owner's portfolio are active, and work on lifecycle management.
- Work with the ISO team to set up user management, profiles, standards, and access policies.
- Work on ad hoc projects to ensure the business gets resources on time.
- Good SRE skills.

Technical Expertise: Strong knowledge of database technologies, including relational databases, NoSQL databases, and data warehousing. Design and define reference architectures for new versions. AWS RDS experience is highly desired.
Programming Languages: Proficiency in SQL and other programming languages used in database development (e.g., Python, Java).
Data Modeling: Experience with data modeling techniques and tools.
Operating Systems: Knowledge of relevant operating systems (e.g., Windows, Linux) and their interaction with database systems.
Problem-Solving: Ability to identify, analyze, and resolve complex database issues.
Communication and Collaboration: Strong communication and collaboration skills to work effectively with development teams and other stakeholders.
Security: Good understanding of IAM tools for integration and of data security principles and best practices; work with the ISO team to set up user management, profiles, standards, and access policies.
Cost Optimization: Good understanding of different licensing models (RDS, Oracle, Microsoft).

Additional skills:
1. Good knowledge of other databases (MySQL, PostgreSQL) and NoSQL (MongoDB).
2. Hands-on experience with cloud databases (upgrades, migration, load balancer setup, DR, replication setup).
3. Hands-on experience with automation tools (Ansible, Terraform).
4. Good at project execution focused on reducing cost and automating reports.
5. Work with the audit team to stabilize and maintain standards.
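As an illustration of the routine cloud-database administration this role describes, here is a hedged boto3 sketch that snapshots and health-checks an RDS instance before maintenance. The region, instance name, and snapshot name are assumptions; the boto3 calls are standard RDS APIs.

```python
# Hedged sketch: routine RDS administration with boto3 -- take a manual
# snapshot and check instance status. All identifiers are placeholders.
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # region is an assumption

# Create a manual snapshot before a patch or upgrade window.
rds.create_db_snapshot(
    DBSnapshotIdentifier="orders-db-pre-patch",  # placeholder name
    DBInstanceIdentifier="orders-db",            # placeholder instance
)

# Poll instance health as part of an operational check.
resp = rds.describe_db_instances(DBInstanceIdentifier="orders-db")
status = resp["DBInstances"][0]["DBInstanceStatus"]
print(f"orders-db status: {status}")
```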
Posted 4 days ago
8.0 - 13.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Description: We are looking for a skilled Power BI Developer with Snowflake expertise to join our Analytics and Insights team. As a Power BI Developer, you will be responsible for designing, developing, and maintaining business intelligence solutions using Power BI and Snowflake, enabling data-driven decision-making within the Enterprise Network Services organization.

Skills Required:
- Hands-on Power BI developer with 3+ years of experience in report creation, data modelling, and data visualization leveraging Power BI and Snowflake.
- Strong understanding of DAX and Power BI's data modelling features.
- 1+ year of experience working with the Snowflake database, with a strong understanding of database design and development concepts and Snowflake's SQL dialect.
- Hands-on experience analyzing complex data sets and building critical business insights.
- Strong understanding of SDLC methodology and DevOps concepts.
- Experience working as an Agile team member with a good understanding of Agile principles.
- Excellent problem-solving and communication skills.
Posted 4 days ago
5.0 - 8.0 years
10 - 14 Lacs
Gurugram
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Apache Spark
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will be responsible for designing, building, and configuring applications. Acting as the primary point of contact, you will lead the development team, oversee the delivery process, and ensure successful project execution.

Roles & Responsibilities:
- Act as a Subject Matter Expert (SME) in application development.
- Lead and manage a development team to achieve performance goals.
- Make key technical and architectural decisions.
- Collaborate with cross-functional teams and stakeholders.
- Provide technical solutions to complex problems across multiple teams.
- Oversee the complete application development lifecycle.
- Gather and analyze requirements in coordination with stakeholders.
- Ensure timely and high-quality delivery of projects.

Professional & Technical Skills:
Must-have skills:
- Proficiency in Apache Spark.
- Strong understanding of big data processing.
- Experience with data streaming technologies.
- Hands-on experience in building scalable, high-performance applications.
- Knowledge of cloud computing platforms.
Must-have additional skills:
- PySpark
- Spark SQL / SQL
- AWS

Additional Information:
- This is a full-time, on-site role based in Gurugram.
- Candidates must have a minimum of 5 years of hands-on experience with Apache Spark.
- A minimum of 15 years of full-time formal education is mandatory.
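To illustrate the PySpark + Spark SQL + AWS combination this posting lists, here is a minimal batch sketch: read a dataset, register it as a view, and aggregate with SQL. The S3 paths and column names are illustrative assumptions only.

```python
# Hedged sketch: PySpark reading Parquet from S3 and aggregating via Spark SQL.
# Paths and column names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-agg").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")  # placeholder path
orders.createOrReplaceTempView("orders")

daily = spark.sql("""
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
""")
daily.write.mode("overwrite").parquet("s3://example-bucket/daily_orders/")
spark.stop()
```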
Posted 4 days ago
2.0 - 4.0 years
4 - 8 Lacs
Pune
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good-to-have skills: experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms and data storage solutions.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.
Posted 4 days ago
6.0 - 11.0 years
15 - 27 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Role: Azure Data Engineer
Location: Pune, Gurgaon, or Bangalore
Work Mode: Hybrid

Key Roles and Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must Have:
- Strong experience with Azure cloud platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good to Have:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Education: Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience.

Key Skills: Azure (Data Factory, Databricks), Snowflake, DBT
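A minimal sketch of the ELT shape this posting describes: land raw data in Snowflake, then transform and test it with dbt. The stage name, model selector, and project layout are assumptions, and in practice ADF would trigger these steps rather than a local script; the `snowsql -q` and `dbt run`/`dbt test` CLI invocations are standard.

```python
# Hedged sketch: load-then-transform ELT driven from Python via CLI calls.
# Stage, table, and selector names are placeholders.
import subprocess

def run(cmd: list[str]) -> None:
    """Run one pipeline step and fail fast on a non-zero exit."""
    subprocess.run(cmd, check=True)

# 1. Load the raw layer (COPY INTO executed through the SnowSQL CLI).
run(["snowsql", "-q", "COPY INTO raw.orders FROM @raw_orders_stage"])

# 2. Transform in-warehouse with dbt, then verify with dbt tests.
run(["dbt", "run", "--select", "staging+"])   # placeholder selector
run(["dbt", "test", "--select", "staging+"])
```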
Posted 4 days ago
6.0 - 11.0 years
15 - 27 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Role: Azure Data Engineer
Location: Pune, Gurugram, Bangalore, or Hyderabad
Work Mode: Hybrid

Key Roles and Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must Have:
- Strong experience with Azure cloud platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good to Have:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Education: Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience.

Key Skills: Azure (Data Factory, Databricks), Snowflake, DBT
Posted 4 days ago
5.0 - 10.0 years
15 - 30 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Requirement: Data Architect & Business Intelligence
Experience: 9+ Years
Location: Gurgaon (Remote)
Preferred: Immediate Joiners

Job Summary: We are looking for a Data Architect & Business Intelligence expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 4 days ago
4.0 - 9.0 years
4 - 8 Lacs
Pune
Work from Office
Experience: 4+ years.

Requirements:
- Expertise in the Python language is a MUST.
- SQL (ability to write complex SQL queries) is a MUST.
- Hands-on experience in Apache Flink Streaming or Spark Streaming is a MUST.
- Hands-on expertise with Apache Kafka is a MUST.
- Data lake development experience.
- Orchestration (Apache Airflow is preferred).
- Spark and Hive: optimization of Spark/PySpark and Hive apps.
- Trino / AWS Athena (good to have).
- Snowflake (good to have).
- Data quality (good to have).
- File storage (S3 is good to have).

Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs and work-life balance: integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.
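Here is a minimal sketch of the Kafka-to-data-lake streaming pattern these requirements point at, using Spark Structured Streaming. Broker address, topic, and S3 paths are placeholder assumptions, and the job needs the `spark-sql-kafka` connector package on the classpath.

```python
# Hedged sketch: Spark Structured Streaming reading from Kafka and writing
# Parquet to a data lake. Requires the spark-sql-kafka connector package.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Kafka delivers raw bytes; cast the payload before writing to the lake.
parsed = events.select(col("value").cast("string").alias("payload"))

query = (
    parsed.writeStream.format("parquet")
    .option("path", "s3://example-bucket/events/")          # placeholder path
    .option("checkpointLocation", "s3://example-bucket/ckpt/")
    .start()
)
query.awaitTermination()
```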
Posted 4 days ago
6.0 - 11.0 years
22 - 25 Lacs
Hyderabad
Hybrid
Requirements:
- Proficiency in Python and SQL.
- Hands-on experience with big data processing using PySpark/Spark.
- Experience with Snowflake or Databricks.
- Familiarity with AWS services, particularly in data pipeline development (Glue, Athena, Crawler, S3, Lambda, Redshift, EMR).

Strong preference will be given to candidates with experience in:
- End-to-end data pipeline design.
- Operationalizing machine learning models.
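As a small illustration of the AWS data-pipeline work listed above, here is a hedged sketch that runs an Athena query over data-lake tables via boto3. The database, table, region, and output bucket are assumptions; the three Athena API calls are standard.

```python
# Hedged sketch: run an Athena query and fetch results with boto3.
# Database, table, and bucket names are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")  # region assumed

qid = athena.start_query_execution(
    QueryString="SELECT channel, COUNT(*) AS n FROM events GROUP BY channel",
    QueryExecutionContext={"Database": "analytics"},        # placeholder
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state, then fetch results.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    print(f"fetched {len(rows)} rows")
```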
Posted 4 days ago
3.0 - 7.0 years
5 - 9 Lacs
Navi Mumbai, Mahape
Work from Office
Responsibilities:
- Design and implement data obfuscation strategies using Thales CipherTrust Tokenization, FPE, and Data Masking modules.
- Define and build reusable obfuscation templates and policies based on data classification, sensitivity, and business use cases.
- Apply obfuscation to PII, PCI, and other regulated data across databases, applications, files, APIs, and cloud services.
- Install, configure, and administer CipherTrust Manager, DSM, and related connectors.
- Develop and deploy integration patterns with enterprise systems (e.g., Oracle, SQL Server, Kafka, Snowflake, Salesforce, AWS, Azure).
- Automate policy deployment and secret rotation using APIs, CLI, or scripting tools (e.g., Ansible, Terraform, Python, Shell).
- Work with or integrate secondary data protection tools (e.g., DLP, IRM, MIP).
- Design, deploy, and manage PKI systems, including root and subordinate certificate authorities (CAs).
- Manage digital certificate lifecycles for users, devices, and services.
- Ensure secure storage and access controls for encryption/decryption and signing keys.
- Enable cross-platform key management and policy consistency for hybrid and multi-cloud environments.
- Align obfuscation patterns with internal data protection standards, classification schemes, and regulatory frameworks (GDPR, CCPA, DORA, SEBI, etc.).
- Provide obfuscation logs and audit evidence to support security assessments, audits, and compliance reviews.
- Implement monitoring and alerting for obfuscation control failures, anomalies, or unauthorized access attempts.
- Create detailed technical documentation, SOPs, and configuration guides.
- Train internal engineering teams and application owners on how to securely integrate with obfuscation services.
- Collaborate with data governance, security architecture, and DevSecOps teams to drive secure-by-design initiatives.

Knowledge, Skills, and Experience Required:
Required:
- 3-6 years of hands-on experience with the Thales CipherTrust Data Security Platform (Tokenization, DSM, FPE, Masking).
- Strong knowledge of data protection concepts: tokenization (deterministic and random), pseudonymization, static/dynamic masking, and encryption.
- Experience integrating obfuscation solutions with databases (Oracle, SQL Server, PostgreSQL, etc.), enterprise apps, and data pipelines.
- Proficiency in scripting and automation tools: Python, Shell, REST APIs, Ansible, CI/CD pipelines.
- Familiarity with key management, HSM integration, and data access policies.

Beneficial:
- Thales Certified Engineer / Architect (CipherTrust).
- CISSP, CISA, CDPSE, or CIPT would be a bonus.
- Cloud security certification (e.g., AWS Security Specialty, Azure SC-300).

Personal Characteristics:
- Strong analytical and problem-solving mindset.
- Ability to work independently in a fast-paced, global enterprise environment.
- Excellent documentation and communication skills.
- Comfortable collaborating with cross-functional teams (App Dev, Security, Compliance, Data Governance).
- Experience supporting enterprise data security transformations and data-centric protection strategies.
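To make the deterministic-tokenization concept named above concrete, here is a hedged Python sketch: the same input always maps to the same token, so joins and referential integrity survive obfuscation. This illustrates the idea only; it is not the Thales CipherTrust API, which is configured through CipherTrust Manager policies rather than application code like this.

```python
# Hedged concept sketch of deterministic tokenization (NOT the CipherTrust
# API): an HMAC keyed by a secret maps each value to a stable token.
import hmac
import hashlib

SECRET_KEY = b"rotate-me-via-your-kms"  # placeholder; manage via KMS/HSM

def tokenize(value: str) -> str:
    """Return a deterministic, irreversible token for a sensitive value."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return f"tok_{digest[:16]}"

# Same PAN -> same token, so referential integrity survives obfuscation.
assert tokenize("4111111111111111") == tokenize("4111111111111111")
print(tokenize("4111111111111111"))
```

Note that true format-preserving encryption (FPE), also listed in this posting, additionally keeps the token in the original value's format and reversible under the key, which a platform like CipherTrust handles via its FPE module.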
Posted 5 days ago
15.0 - 20.0 years
45 - 55 Lacs
Pune, Bengaluru
Work from Office
Job Title: Senior Solution Architect – Data & Cloud
Experience: 12+ Years
Location: Hybrid / Remote
Practice: Migration Works
Employment Type: Full-time

About the Company: We are a data and analytics firm that provides the strategies, tools, capability, and capacity that businesses need to turn their data into a competitive advantage. USEReady partners with cloud and data ecosystem leaders like Tableau, Salesforce, Snowflake, Starburst, and Amazon Web Services, and has been named Tableau Partner of the Year multiple times. Headquartered in NYC, the company has 450 employees across offices in the U.S., Canada, India, and Singapore and specializes in financial services. USEReady's deep analytics expertise, unique player/coach approach, and focus on fast results make the company a perfect partner for a cloud-first, digital world.

About the Role: We are looking for a highly experienced Senior Solution Architect to join our Migration Works practice, specializing in modern data platforms and visualization tools. The ideal candidate will bring deep technical expertise in Tableau, Power BI, AWS, and Snowflake, along with strong client-facing skills and the ability to design scalable, high-impact data solutions. You will be at the forefront of driving our AI-driven migration and modernization initiatives, working closely with customers to understand their business needs and guiding delivery teams to success.

Key Responsibilities:
Solution Design & Architecture:
- Lead the end-to-end design of cloud-native data architecture using AWS, Snowflake, and the Azure stack.
- Translate complex business requirements into scalable and efficient technical solutions.
- Architect modernization strategies for migrating legacy BI systems to cloud-native platforms.
Client Engagement:
- Conduct technical discussions with enterprise clients and stakeholders to assess needs and define the roadmap.
- Act as a trusted advisor during pre-sales and delivery phases, showcasing technical leadership and a consultative approach.
Migration & Modernization:
- Design frameworks for data platform migration (from on-premise to cloud), data warehousing, and analytics transformation.
- Support estimation, planning, and scoping of migration projects.
Team Leadership & Delivery Oversight:
- Guide and mentor delivery teams across geographies, ensuring solution quality and alignment with client goals.
- Support delivery by providing architectural oversight and resolving design bottlenecks.
- Conduct technical reviews, define best practices, and uplift the team's capabilities.

Required Skills & Experience:
- 15+ years of progressive experience in data and analytics, with at least 5 years in solution architecture roles.
- Strong hands-on expertise in:
  - Tableau and Power BI: dashboard design, visualization architecture, and migration from legacy BI tools.
  - AWS: S3, Redshift, Glue, Lambda, and data pipeline components.
  - Snowflake: architecture, SnowConvert, data modeling, security, and performance optimization.
- Experience migrating legacy platforms (e.g., Cognos, BO, Qlik) to modern BI/cloud-native stacks like Tableau and Power BI.
- Proven ability to interface with senior client stakeholders, understand business problems, and propose architectural solutions.
- Strong leadership, communication, and mentoring skills.
- Familiarity with data governance, security, and compliance in cloud environments.

Preferred Qualifications:
- AWS/Snowflake certifications are a strong plus.
- Exposure to data catalog and lineage tools, and metadata management.
- Knowledge of ETL/ELT tools such as Talend, Informatica, or dbt.
- Prior experience working in consulting or fast-paced client services environments.

What We Offer:
- Opportunity to work on cutting-edge, AI-led cloud and data migration projects.
- A collaborative and high-growth environment with room to shape future strategy.
- Access to learning programs, certifications, and technical leadership exposure.
Posted 5 days ago
5.0 - 8.0 years
12 - 18 Lacs
Bengaluru
Work from Office
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 3-5 years of experience in ETL development and data integration.
• Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, or MySQL.
• Familiarity with data warehousing concepts and methodologies.
• Hands-on experience with ETL tools like Informatica, Talend, SSIS, or similar.
• Knowledge of data modeling and data governance best practices.
• Strong analytical skills and attention to detail.
• Excellent communication and teamwork skills.
• Experience with Snowflake or willingness to learn and implement Snowflake-based solutions.
• Experience with Big Data technologies such as Hadoop or Spark.
• Knowledge of cloud platforms like AWS, Azure, or Google Cloud and their ETL services.
• Familiarity with data visualization tools such as Tableau or Power BI.
• Hands-on experience with Snowflake for data warehousing and analytics.
Posted 5 days ago
10.0 - 14.0 years
35 - 45 Lacs
Hyderabad
Work from Office
About the Team: At DAZN, the Analytics Engineering team is at the heart of turning hundreds of data points into meaningful insights that power strategic decisions across the business. From content strategy to product engagement, marketing optimization to revenue intelligence, we enable scalable, accurate, and accessible data for every team.

The Role: We're looking for a Lead Analytics Engineer to take ownership of our analytics data pipeline and play a pivotal role in designing and scaling our modern data stack. This is a hands-on technical leadership role where you'll shape the data models in dbt/Snowflake, orchestrate pipelines using Airflow, and enable high-quality, trusted data for reporting.

Key Responsibilities:
- Lead the development and governance of DAZN's semantic data models to support consistent, reusable reporting metrics.
- Architect efficient, scalable data transformations on Snowflake using SQL/dbt and best practices in data warehousing.
- Manage and enhance pipeline orchestration with Airflow, ensuring timely and reliable data delivery.
- Collaborate with stakeholders across Product, Finance, Marketing, and Technology to translate requirements into robust data models.
- Define and drive best practices in version control, testing, and CI/CD for analytics workflows.
- Mentor and support junior engineers, fostering a culture of technical excellence and continuous improvement.
- Champion data quality, documentation, and observability across the analytics layer.

You'll Need to Have:
- 10+ years of experience in data/analytics engineering, with 2+ years leading or mentoring engineers.
- Deep expertise in SQL and cloud data warehouses (preferably Snowflake) and cloud services (AWS/GCP/Azure).
- Proven experience with dbt for data modeling and transformation.
- Hands-on experience with Airflow (or similar orchestrators like Prefect or Luigi).
- Strong understanding of dimensional modeling, ELT best practices, and data governance principles.
- Ability to balance hands-on development with leadership and stakeholder management.
- Clear communication skills: you can explain technical concepts to both technical and non-technical teams.

Nice to Have:
- Experience in the media, OTT, or sports tech domain.
- Familiarity with BI tools like Looker or Power BI.
- Exposure to testing frameworks like dbt tests or Great Expectations.
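For the dbt + Airflow orchestration this role owns, here is a minimal sketch of an Airflow DAG sequencing dbt build and test steps. The dag_id, schedule, and dbt project path are assumptions; the DAG and BashOperator APIs are standard Airflow 2.x.

```python
# Hedged sketch: an Airflow DAG that runs dbt models, then dbt tests.
# dag_id, schedule, and project path are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="analytics_dbt",              # placeholder
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",       # daily at 06:00; an assumption
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",   # placeholder path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test",
    )
    dbt_run >> dbt_test  # tests only run after models build successfully
```

Keeping tests as a downstream task means a failed model build blocks the test step and surfaces cleanly in the DAG view, which supports the data-quality and observability goals listed above.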
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
haryana
On-site
About KPMG in India: KPMG entities in India are professional services firms affiliated with KPMG International Limited. Established in August 1993, KPMG leverages a global network of firms and maintains expertise in local laws, regulations, markets, and competition. With offices in major Indian cities like Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, KPMG caters to national and international clients across various sectors. The services offered by KPMG in India are characterized by their rapid, performance-based, industry-focused, and technology-enabled approach, reflecting a deep understanding of global and local industries and the Indian business environment.

Key skills: Snowflake, SQL, DBT
Qualifications: B.Tech
Equal employment opportunity information
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative, and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our clients' challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose.

This is an exciting opportunity to build and scale the Japanese Data capability that will support our customers in their AI & Data journey. Within Capgemini Invent Japan you will contribute to business development through leading and contributing to proposals, RFPs, bids, proposition development, client pitches, and client hosting at events. You will deliver projects with our customers, such as:
- Define Data & AI strategies aligned with our customers' key challenges: operational efficiency, scalability, compliance, customer centricity.
- Structure data domains and the associated governance and standards to fuel transformation through data.
- Frame and deliver use cases to support operations across a wide array of business domains: supply chain, purchasing, quality, engineering.
- Enrich our customers' business models and service offerings through data-powered products and services.
You will also contribute to structuring and upskilling the data-for-operations team and its associated offerings.

Your Profile:
- Expertise in one or several technologies and solutions: hyperscalers (AWS, Azure, GCP), Databricks, Snowflake.
- Experience building and developing a data/AI team.
- Currently working in a major consulting firm, and/or in industry with a consulting background.
- Fluency in Japanese and client-facing abilities with Japanese customers; willingness to travel to Japan as needed.
- Experience with proposition building and delivery.
- Strong business acumen and the capacity to leverage a combination of your personal network and the company's relationship network to feed and sustain a strong business pipeline.

What you will love about working here:
- We recognize the significance of flexible work arrangements. Be it remote work or flexible work hours, you will get an environment that supports a healthy work-life balance.
- At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities.
- Equip yourself with valuable certifications in the latest technologies such as Generative AI.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
Posted 6 days ago
3.0 - 8.0 years
0 Lacs
delhi
On-site
As a Snowflake Solution Architect, you will own and drive the development of Snowflake solutions and products as part of the COE. Your role will involve working with and guiding the team to build solutions using the latest innovations and features launched by Snowflake. Additionally, you will conduct sessions on the latest and upcoming launches of the Snowflake ecosystem and liaise with Snowflake Product and Engineering to stay ahead of new features, innovations, and updates. You will be expected to publish articles and architectures that solve business problems. Furthermore, you will work on accelerators to demonstrate how Snowflake solutions and tools integrate with and compare to other platforms such as AWS, Azure Fabric, and Databricks.

In this role, you will lead the post-sales technical strategy and execution for high-priority Snowflake use cases across strategic customer accounts. You will also triage and resolve advanced, long-running customer issues while ensuring timely and clear communication. Developing and maintaining robust internal documentation, knowledge bases, and training materials to scale support efficiency will also be part of your responsibilities. Additionally, you will support enterprise-scale RFPs focused on Snowflake.

To be successful in this role, you should have at least 8 years of industry experience, including a minimum of 3 years in a Snowflake consulting environment. You should have experience implementing and operating Snowflake-centric solutions, along with proficiency in implementing data security measures, access controls, and design specifically within the Snowflake platform. An understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools, is essential. Strong skills in databases, data warehouses, and data processing, as well as extensive hands-on expertise with SQL and SQL analytics, are required. Familiarity with data science concepts and Python is a strong advantage.

Knowledge of Snowflake components is necessary, including Snowpipe, query parsing and optimization, Snowpark, Snowflake ML, authorization and access control management, metadata management, infrastructure management and auto-scaling, the Snowflake Marketplace for datasets and applications, and DevOps and orchestration tools such as Airflow, dbt, and Jenkins. Snowflake certifications would be good to have.

Strong communication and presentation skills are essential, as you will engage with both technical and executive audiences, and you should be skilled in working collaboratively across engineering, product, and customer success teams. This position is open in all Xebia office locations, including Pune, Bangalore, Gurugram, Hyderabad, Chennai, Bhopal, and Jaipur. If you meet the above requirements and are excited about this opportunity, please share your details here: [Apply Now](https://forms.office.com/e/LNuc2P3RAf)
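Of the Snowflake components this posting lists, Snowpark is the one most naturally shown in code: transformations written in Python compile to SQL and execute inside Snowflake. Below is a minimal sketch; all connection parameters, table, and column names are placeholder assumptions.

```python
# Hedged sketch: a Snowpark session running a pushed-down aggregation.
# All connection parameters and identifiers are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "my_account",   # placeholder
    "user": "analyst",
    "password": "***",
    "warehouse": "ANALYTICS_WH",
    "database": "SALES",
    "schema": "PUBLIC",
}).create()

# The filter/group/agg below compiles to SQL and runs inside Snowflake,
# so no raw data needs to leave the warehouse.
revenue = (
    session.table("orders")
    .filter(col("status") == "SHIPPED")
    .group_by("region")
    .agg(sum_("amount").alias("revenue"))
)
revenue.show()
session.close()
```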
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
ahmedabad, gujarat
On-site
YipitData is a leading market research and analytics firm specializing in the disruptive economy, having recently secured a significant investment from The Carlyle Group valued at over $1B. Recognized for three consecutive years as one of Inc's Best Workplaces, we are a rapidly expanding technology company with offices across various locations globally, fostering a culture centered on mastery, ownership, and transparency.

As a potential candidate, you will have the opportunity to collaborate with strategic engineering leaders and report directly to the Director of Data Engineering. This role involves contributing to the establishment of our Data Engineering team presence in India and working within a global team framework, tackling challenging big data problems.

We are currently in search of a highly skilled Senior Data Engineer with 6-8 years of relevant experience to join our dynamic Data Engineering team. The ideal candidate should possess a solid grasp of Spark and SQL, along with experience in data pipeline development. Successful candidates will play a vital role in expanding our data engineering team, focusing on enhancing reliability, efficiency, and performance within our strategic pipelines.

The Data Engineering team at YipitData sets the standard for all other analyst teams, maintaining and developing the core pipelines and tools that drive our products. This team plays a crucial role in supporting the rapid growth of our business and presents a unique opportunity for the first hire to potentially lead and shape the team as responsibilities evolve. This hybrid role will be based in India; training and onboarding will require overlap with US working hours initially, after which standard IST working hours are permissible, with occasional meetings with the US team.

As a Senior Data Engineer at YipitData, you will work directly under the Senior Manager of Data Engineering, receiving hands-on training on cutting-edge data tools and techniques. Responsibilities include building and maintaining end-to-end data pipelines, establishing best practices for data modeling and pipeline construction, generating documentation and training materials, and proficiently resolving complex data pipeline issues using PySpark and SQL. Collaboration with stakeholders to integrate business logic into central pipelines and mastering tools like Databricks, Spark, and other ETL technologies are also key aspects of the role.

Successful candidates are likely to have a Bachelor's or Master's degree in Computer Science, STEM, or a related field, with at least 6 years of experience in Data Engineering or similar technical roles. An enthusiasm for problem-solving, continuous learning, and a strong understanding of data manipulation and pipeline development are essential. Proficiency in working with large datasets using PySpark, Delta, and Databricks, aligning data transformations with business needs, and a willingness to acquire new skills are crucial for success. Effective communication skills, a proactive approach, and the ability to work collaboratively with stakeholders are highly valued.

In addition to a competitive salary, YipitData offers a comprehensive compensation package that includes various benefits, perks, and opportunities for personal and professional growth. Employees are encouraged to focus on their impact, self-improvement, and skill mastery in an environment that promotes ownership, respect, and trust.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Platform Engineer at our company, you will be a key member of the Data Platform team, leveraging your expertise in data platforms, data warehousing, and data administration. Your role will involve defining and aligning data platform strategies across the organization, ensuring optimal performance of our data lake and data warehouse environment to meet business needs effectively.

Responsibilities:
- Define and align Data Lake and Data Warehouse usage strategies organization-wide
- Design, develop, and maintain Data Lake and Data Warehouse solutions
- Perform data lake and data warehouse administration tasks, including user management, security, and performance tuning
- Collaborate with data architects, business stakeholders, and other teams to understand data requirements and establish guidelines and processes
- Define cost attribution, optimization, and monitoring strategies for the data platform
- Develop and maintain data models, schemas, and database objects
- Monitor and optimize data lake and data warehouse performance for high availability and reliability
- Stay updated with the latest advancements in data platforms and related technologies
- Provide mentorship and guidance to junior team members

Minimum Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 5+ years of experience in data platform engineering
- 3+ years of hands-on experience with data lakes and data warehouses
- Experience with cloud platforms like AWS and data warehouse solutions like Snowflake
- Experience with provisioning and maintenance of Spark, Presto, and Kubernetes clusters
- Familiarity with open table formats like Iceberg and metadata stores like HMS, GDC, etc.
- Strong problem-solving skills and attention to detail
- Excellent communication and collaboration skills

Preferred Qualifications:
- Snowflake experience
- Proficiency in coding languages such as Python
- Familiarity with data visualization tools like Looker
- Experience with Agile/Scrum methodologies

Join us at Autodesk, where every day brings new opportunities to create amazing things with our software. Embrace our culture that values diversity, belonging, and innovation, and be part of a team that shapes the future. Discover a rewarding career that helps build a better world for all.
Posted 6 days ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
You will be joining Atgeir Solutions, a leading innovator in technology, renowned for its commitment to excellence. As a Technical Lead specializing in Big Data and Cloud technologies, you will have the opportunity for advancement to the role of Technical Architect.

Your responsibilities will include leveraging your expertise in Big Data and Cloud technologies to contribute to the design, development, and implementation of complex systems. You will lead and inspire a team of professionals, offering technical guidance and mentorship to foster a collaborative and innovative work environment. In addition, you will be tasked with solving intricate technical challenges and guiding your team in overcoming obstacles in Big Data and Cloud environments. Investing in the growth and development of your team members will be crucial, including identifying training needs, organizing knowledge-sharing sessions, and promoting a culture of continuous learning. Collaboration with stakeholders, such as clients, architects, and other leads, will be essential to understand requirements and align technology strategies with business goals, particularly in the realm of Big Data and Cloud.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with 7-10 years of experience in software development. A proven track record of technical leadership in Big Data and Cloud environments is required. Proficiency in technologies like Hadoop, Spark, GCP, AWS, and Azure is essential, with knowledge of Databricks/Snowflake considered an advantage. Strong communication and interpersonal skills are necessary to convey technical concepts to various stakeholders effectively.

Upon successful tenure as a Technical Lead, you will have the opportunity to progress into the role of Technical Architect. This advancement will entail additional responsibilities related to system architecture, design, and strategic technical decision-making, with a continued focus on Big Data and Cloud technologies.
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
You are a strategic thinker passionate about driving solutions in financial analysis. You have found the right team.

As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will be responsible for overseeing the design, implementation, and maintenance of data marts to support our organization's business intelligence and analytics initiatives. You will collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. Leading the development of robust data models to ensure data integrity and consistency, you will oversee the implementation of ETL processes to populate data marts with accurate and timely data. Your role will involve optimizing data mart performance and scalability, ensuring high availability and reliability, while mentoring and guiding a team of data mart developers.

Responsibilities:
- Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs.
- Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications.
- Develop and implement robust data models to support data marts, ensuring data integrity and consistency.
- Oversee the implementation of ETL processes to populate data marts with accurate and timely data.
- Optimize data mart performance and scalability, ensuring high availability and reliability.
- Monitor and troubleshoot data mart issues, providing timely resolutions and improvements.
- Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity.
- Mentor and guide a team of data mart developers if needed, fostering a collaborative and innovative work environment.
- Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence.

Required qualifications, capabilities, and skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in data warehousing, data mart development, and ETL processes.
- Strong expertise in data lakes, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server).
- Leadership experience, with the ability to manage and mentor a team.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to work effectively with cross-functional teams.

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with advanced data modeling techniques and tools.
- Knowledge of data governance, data security, and compliance practices.
- Experience with business intelligence tools (e.g., Tableau, Power BI).

Candidates must be able to physically work in our Bengaluru office in the evening shift from 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
This role involves leading the design and development of Global Supply Chain Advanced Analytics applications, with hands-on involvement in strategic priorities within the Supply Chain and Operations domain. We are looking for a Solution Design Expert to drive key digital initiatives in the Supply Chain Data & Analytics DSAI team, lead multiple advanced analytics projects, and collaborate with various global stakeholders such as Business Owners, Architecture teams, Data Scientists, DevOps, Security, Integration Factory, and the Compliance team to ensure successful solution delivery.

Major responsibilities include designing and developing highly scalable, robust, reusable, and maintainable analytics and machine learning solutions for the Supply Chain organization across multiple strategic initiatives. You will participate in fit-gap workshops, provide effort estimates and solution proposals, develop and maintain code repositories, collaborate extensively with colleagues across different functions, and drive data enablement according to business priorities and technology alignment.

Key Performance Indicators for this role include delivery on agreed KPIs, launching innovative technology solutions across Novartis at scale, generating business impact and value from DDIT solutions, adopting Agile Productization and DevOps practices, ensuring operational stability and effective risk management, receiving positive feedback on customer experience, and ensuring applications adhere to ISC requirements and are audit-ready.

The ideal candidate should have at least 8 years of prior programming experience in a professional environment, experience working with complex landscapes and various integration capabilities, experience in customer-facing IT roles, strong data engineering fundamentals, hands-on exposure to Data & Analytics solutions including Artificial Intelligence and Machine Learning projects, and preferably an AWS certification. Proficiency in data and analytics technology and solutions, experience with various machine learning use cases, exposure to external datasets, various AWS services, data warehouses, and data lakes, and hands-on expertise in tools like SQL, Python, Databricks, Snowflake, and Power BI are essential skills for this role. Fluency in English is necessary for this position.

Novartis is committed to helping people with diseases and their families by fostering a community of smart, passionate individuals who collaborate, support, and inspire each other to achieve breakthroughs that change patients' lives. If you are ready to contribute to creating a brighter future together, consider joining our Novartis Network to stay connected and explore suitable career opportunities in the future. For more information on benefits and rewards, you can read our handbook at [Novartis Benefits and Rewards](https://www.novartis.com/careers/benefits-rewards).
Posted 6 days ago