5 - 7 years
4 - 9 Lacs
Pune
Work from Office
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain Snowflake tables, views, stored procedures, streams, and tasks using SQL
- Extract data from Snowflake tables into JSON format using SnowSQL
- Create dashboards in Power BI to visualize the extracted data
- Collaborate with cross-functional teams to gather requirements and deliver solutions on time
- Troubleshoot issues related to Snowflake database performance optimization
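As a hedged illustration of the Snowflake-to-JSON extraction this role describes (the posting names SnowSQL; the sketch below uses the snowflake-connector-python package instead, and all table, credential, and file names are hypothetical placeholders):

```python
# Minimal sketch: pull rows from a Snowflake table as JSON objects.
# Connection details and table name are hypothetical placeholders.
import json
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    user="REPORTING_USER",          # placeholder credentials
    password="...",
    account="myorg-myaccount",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # OBJECT_CONSTRUCT(*) serializes each row into a JSON object.
    cur.execute("SELECT OBJECT_CONSTRUCT(*) AS row_json FROM SALES_SUMMARY")
    rows = [json.loads(r[0]) for r in cur.fetchall()]
    with open("sales_summary.json", "w") as f:
        json.dump(rows, f, indent=2)
finally:
    conn.close()
```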
Posted 2 months ago
8 - 13 years
27 - 42 Lacs
Gurgaon
Work from Office
Full Stack Application Developer Wanted
We're looking for a passionate, cutting-edge developer who lives and breathes code and is ready to make a real impact on our diverse, global team. If you have a flair for innovative problem-solving and a proven track record (GitHub!), we want to hear from you!

What You'll Do:
- Develop & Innovate: Build and maintain modern full stack applications using technologies like React and Node.js. Integrate state-of-the-art libraries and tools from our stack (e.g., react-router, ag-grid, vite, redux, and more).
- AWS & Infrastructure Interaction: Work with AWS services including CloudFront, EC2 (Ubuntu), and S3, logging in when needed to write scripts, update security keys, manage environment variables, and update packages.
- CI/CD & Linux Mastery: Set up and maintain CI/CD pipelines (GitLab experience a plus, but git fundamentals are key). Leverage your solid Linux know-how to streamline development and deployment processes.
- Database & ORM Expertise: Develop database tables and views in Microsoft SQL Server. Know how to read an execution plan. Demonstrate working knowledge of ORMs (e.g., Sequelize) to interface seamlessly with our data layers. Working knowledge of cloud data warehouses like Snowflake, Redshift, or BigQuery. We use Snowflake.
- Team Collaboration & Communication: Thrive in a culturally diverse, remote team environment. Bring native-level English skills and the ability to understand not just direct instructions, but the underlying goals behind tasks.

What We're Looking For:
- Passion & Proven Experience: You truly love coding and have personal projects or a GitHub repo to prove it. 8+ years of web-based development experience. Stay updated with the latest trends and developments in full stack technologies.
- Technical Prowess: Solid front-end expertise with React and its ecosystem. Proficiency in back-end development with Node.js, Express, and related tooling. Hands-on experience with AWS, CI/CD processes, Linux environments, writing SQL, and ORM integration.
- Availability: Must be available during core working hours that overlap with EST/PST time zones. Our daily standups are at 11 AM EST.
- Database & SQL Skills: Advanced knowledge of Microsoft SQL Server SQL and ORM integration.
- Collaborative Spirit: Excellent teamwork, communication skills, and the ability to take direction and offer innovative solutions.
Posted 2 months ago
7 - 12 years
9 - 17 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at the client end, and to ensure they meet 100% quality assurance parameters.

1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas, following the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert requirements into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing these cases
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all code issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document all necessary details and reports formally so the software is properly understood from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally
Posted 2 months ago
10 - 13 years
25 - 37 Lacs
Gurgaon, Noida
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

REQUIREMENTS:
- Total experience: 10+ years
- Experience in data engineering and database management
- Expert knowledge of PostgreSQL (preferably cloud-hosted on AWS, Azure, or GCP)
- Experience with Snowflake Data Warehouse and strong SQL programming skills
- Deep understanding of stored procedures, performance optimization, and handling large-scale data
- Knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning
- Strong understanding of index design and performance tuning techniques
- Familiarity with SQL security techniques, including data encryption, Transparent Data Encryption (TDE), signed stored procedures, and user permission assignments
- Competence in data preparation and ETL tools to build and maintain data pipelines and flows
- Experience in data integration by mapping various source platforms into Entity Relationship Models (ERMs)
- Exposure to source control systems like Git and Azure DevOps
- Expertise in Python and Machine Learning (ML) model development
- Experience with automated testing and test coverage tools
- Hands-on experience with CI/CD automation tools
- Programming experience in Golang
- Understanding of Agile methodologies (Scrum, Kanban)
- Ability to collaborate with stakeholders across Executive, Product, Data, and Design teams

RESPONSIBILITIES:
- Design and maintain an optimal data pipeline architecture
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Develop pipelines for data extraction, transformation, and loading (ETL) using SQL and cloud database technologies
- Prepare and optimize ML models to improve business insights
- Support stakeholders by resolving data-related technical issues and enhancing data infrastructure
- Ensure data security across multiple data centers and regions, maintaining compliance with national and international data laws
- Collaborate with data and analytics teams to enhance data systems functionality
- Conduct exploratory data analysis to support database and dashboard development
Posted 2 months ago
4 - 6 years
3 - 6 Lacs
Pune
Work from Office
Key Responsibilities:
- Design and develop data pipelines using Linux shell scripting (Bash, Perl, etc.)
- Work with Snowflake and Teradata databases to optimize data models, queries, and performance
Posted 2 months ago
3 - 8 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at the client end, and to ensure they meet 100% quality assurance parameters.

Must-have technical skills:
- 4+ years on Snowflake: advanced SQL expertise
- 4+ years of data warehouse experience: hands-on knowledge of the methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schema; normalization/denormalization; dimensions; aggregations; etc.
- 4+ years working in reporting and analytics environments: development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning, etc.
- 3+ years on Python: advanced Python expertise
- 3+ years on any cloud platform, AWS preferred: hands-on experience on AWS with Lambda, S3, SNS/SQS, and EC2 is the bare minimum
- 3+ years on any ETL/ELT tool: Informatica, Pentaho, Fivetran, DBT, etc.
- 3+ years developing functional metrics in any specific business vertical (finance, retail, telecom, etc.)

Must-have soft skills:
- Clear communication: written and verbal, especially around time off, delays in delivery, etc.
- Team player: works in the team and works with the team
- Enterprise experience: understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation, etc.

Nice to have:
- Technical certifications from AWS, Microsoft, Azure, GCP, or any other recognized software vendor
- 4+ years on any ETL/ELT tool: Informatica, Pentaho, Fivetran, DBT, etc.
- 4+ years developing functional metrics in any specific business vertical (finance, retail, telecom, etc.)
- 4+ years of team lead experience
- 3+ years in a large-scale support organization supporting thousands of users

Deliverables (No. Performance Parameter: Measure):
1. Continuous integration, deployment & monitoring of software: 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT: on-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. MIS & reporting: 100% on-time MIS & report generation

Competencies: Client Centricity, Passion for Results, Execution Excellence, Problem Solving & Decision Making, Effective Communication
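To illustrate the AWS hands-on bar above (Lambda, S3, SNS/SQS), here is a minimal, hedged Python sketch of an S3-triggered Lambda that notifies an SNS topic when a file lands; the bucket, key structure, and topic ARN are hypothetical placeholders, not details from the posting:

```python
# Hypothetical sketch: an S3-triggered Lambda that inspects a landed file
# and notifies downstream consumers via SNS. The topic ARN is a placeholder.
import json
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:etl-file-landed"  # placeholder

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        head = s3.head_object(Bucket=bucket, Key=key)  # size/metadata check
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="New file landed",
            Message=json.dumps({"bucket": bucket, "key": key,
                                "size_bytes": head["ContentLength"]}),
        )
    return {"files_processed": len(event["Records"])}
```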
Posted 2 months ago
9 - 14 years
20 - 35 Lacs
Bengaluru
Remote
Snowflake Architect
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai

Job Details
Technical Expertise:
- Strong proficiency in Snowflake architecture, including data sharing, partitioning, clustering, and materialized views
- Advanced experience with DBT for data transformations and workflow management
- Expertise in Azure services, including Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure Functions

Data Engineering:
- Proficiency in SQL, Python, or other relevant programming languages
- Strong understanding of data modeling concepts, including star schema and normalization
- Hands-on experience with ETL/ELT pipelines and data integration tools

Soft Skills:
- Excellent problem-solving and analytical skills
- Strong communication and stakeholder management abilities
- Ability to work in agile teams and handle multiple priorities

Preferred Qualifications:
- Certifications in Snowflake, DBT, or Azure Data Engineering
- Familiarity with data visualization tools like Power BI or Tableau
- Knowledge of CI/CD pipelines and DevOps practices for data workflows
Posted 2 months ago
6 - 8 years
19 - 25 Lacs
Pune
Work from Office
About Zscaler
Serving thousands of enterprise customers around the world, including 40% of Fortune 500 companies, Zscaler (NASDAQ: ZS) was founded in 2007 with a mission to make the cloud a safe place to do business and a more enjoyable experience for enterprise users. As the operator of the world's largest security cloud, Zscaler accelerates digital transformation so enterprises can be more agile, efficient, resilient, and secure. The pioneering, AI-powered Zscaler Zero Trust Exchange™ platform, which is found in our SASE and SSE offerings, protects thousands of enterprise customers from cyberattacks and data loss by securely connecting users, devices, and applications in any location. Named a Best Workplace in Technology by Fortune and others, Zscaler fosters an inclusive and supportive culture that is home to some of the brightest minds in the industry. If you thrive in an environment that is fast-paced and collaborative, and you are passionate about building and innovating for the greater good, come make your next move with Zscaler.

Our General and Administrative teams help to support and scale our great company. Whether striving to grow our workforce, nurture an amazing culture and work environment, support our financial and legal operations, or maintain our global infrastructure, the G&A team provides a solid foundation for growth. Put your passion and expertise to work with the world's cloud security leader.

We're looking for an experienced Deputy Manager - Finance Transformation to join our Finance Transformation and Process Optimization team. This role will involve collaborating with cross-functional teams, providing strategic insights, and driving continuous improvements.

You will be responsible for:
- Assessing end-to-end finance processes to identify improvement and automation opportunities, prioritizing identified opportunities, and developing a plan for delivery
- Leading the identification, prioritization/planning, design, and development of finance process automation, using RPA and other tools like Power Query, VBA, and Alteryx
- Understanding the current state of our finance processes and tech stack, including ERP (NetSuite), FloQast, Workday, Coupa, Ceridian, RPA, etc.
- Working with internal teams to gather requirements, drive design, test solutions, build training, and implement identified technology solutions and controls during automation and process changes
- Building and maintaining documentation and training material to institutionalize knowledge and identify areas to integrate or improve our technology and process landscape

What We're Looking For (Minimum Qualifications):
- 6-8 years of experience leading and delivering technology-enabled finance transformations
- Hands-on finance process automation experience: end-to-end design and development of RPA solutions (preferably UiPath, Power Query, VBA, or Alteryx)
- Knowledge of end-to-end finance processes such as RTR, O2C, STP, FP&A, Tax, and Treasury

What Will Make You Stand Out (Preferred Qualifications):
- Experience in successful implementation of tools and technology supporting finance processes, data, and architecture
- Accounting background, knowledge of SaaS business, and hands-on experience with NetSuite, Workday, Coupa, Adaptive, Salesforce
- Strong data analytics experience (preferred tools: Tableau, Snowflake) and strong project management skills (PMP or equivalent certification is a plus)

At Zscaler, we believe that diversity drives innovation, productivity, and success.
We are looking for individuals from all backgrounds and identities to join our team and contribute to our mission to make doing business seamless and secure. We are guided by these principles as we create a representative and impactful team, and a culture where everyone belongs. For more information on our commitments to Diversity, Equity, Inclusion, and Belonging, visit the Corporate Responsibility page of our website. Our Benefits program is one of the most important ways we support our employees. Zscaler proudly offers comprehensive and inclusive benefits to meet the diverse needs of our employees and their families throughout their life stages, including: Various health plans Time off plans for vacation and sick time Parental leave options Retirement options Education reimbursement In-office perks, and more! By applying for this role, you adhere to applicable laws, regulations, and Zscaler policies, including those related to security and privacy standards and guidelines. Zscaler is proud to be an equal opportunity and affirmative action employer. We celebrate diversity and are committed to creating an inclusive environment for all of our employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex (including pregnancy or related medical conditions), age, national origin, sexual orientation, gender identity or expression, genetic information, disability status, protected veteran status or any other characteristics protected by federal, state, or local laws. See more information by clicking on the Know Your Rights: Workplace Discrimination is Illegal link. Pay Transparency Zscaler complies with all applicable federal, state, and local pay transparency rules. For additional information about the federal requirements, click here . Zscaler is committed to providing reasonable support (called accommodations or adjustments) in our recruiting processes for candidates who are differently abled, have long term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support.
Posted 2 months ago
4 - 9 years
20 - 35 Lacs
Bhubaneshwar, Pune, Bengaluru
Work from Office
Must have 5+ years of IT experience as a Snowflake developer with Python knowledge and a data warehouse background.
ETL stack: Python, data as a service (API), batch & stream data processing on the AWS platform, SQL, shell
Posted 2 months ago
8 - 10 years
20 - 30 Lacs
Hyderabad, Gurgaon, Jaipur
Work from Office
Roles and Responsibilities:
- Develop and maintain the product roadmap for Data Warehousing solutions, ensuring alignment with business objectives
- Collaborate with cross-functional teams to gather requirements, prioritize features, and deliver high-quality products on time
- Own the end-to-end lifecycle of new features and enhancements, from concept to delivery
- Analyze market trends, competitor analysis, and customer feedback to inform product decisions

Job Requirements:
- 8-10 years of experience in Product Ownership or a similar role in the IT Services & Consulting industry
- Strong understanding of big data frameworks such as Hadoop, Hive, Pig, Spark, Databricks, etc.
- Experience working with Snowflake or other cloud-based data warehouses is an added advantage
- Strong domain expertise in pharmaceutical/manufacturing/supply chain and a good technical understanding of data metrics
- Detail-oriented, unfazed by technical detail, committed to flawless execution
- Excellent communication and stakeholder management skills
- Solid track record of having shipped products and features in an agile, fast-paced environment
- Ability to participate in technical discussions and help make technical trade-offs
- Excellent verbal and written communication skills and ability to convey a product strategy to partners and senior leaders
Posted 2 months ago
8 - 10 years
20 - 30 Lacs
Chennai, Pune, Bengaluru
Work from Office
Job Description
Responsibilities:
- Document high-level requirements, acceptance criteria, and KPIs that push the product strategy forward and achieve key objectives
- Maintain epics and user stories; work with the team on prioritizing and grooming, and provide clarity and direction to the development team
- Excellent stakeholder management skills and good communication skills
- Communicate across the organization to ensure alignment and clarity on what is being developed and why
- Assist in creating a strategy behind the why, when, and what (and what not) for the product
- Guide product teams with a diverse range of skill sets, locations, and direct and indirect reporting relationships
- Make data-informed decisions based on a sound understanding of organizational priorities, Voice of Customer, analytics, benchmarks, industry reporting, and emerging trends

Skills needed:
- 8-10 years of end-to-end product management experience
- Strong experience in product discovery/requirement gathering
- Strong domain expertise in pharmaceutical/manufacturing/supply chain and a good technical understanding of data metrics
- Detail-oriented, unfazed by technical detail, committed to flawless execution
- Excellent communication and stakeholder management skills
- Solid track record of having shipped products and features in an agile, fast-paced environment
- Ability to participate in technical discussions and help make technical trade-offs
- Excellent verbal and written communication skills and ability to convey a product strategy to partners and senior leaders
Posted 2 months ago
2 - 4 years
4 - 6 Lacs
Chennai
Work from Office
Bounteous x Accolite is a premier end-to-end digital transformation consultancy dedicated to partnering with ambitious brands to create digital solutions for today's complex challenges and tomorrow's opportunities. With uncompromising standards for technical and domain expertise, we deliver innovative and strategic solutions in Strategy, Analytics, Digital Engineering, Cloud, Data & AI, Experience Design, and Marketing. Our Co-Innovation methodology is a unique engagement model designed to align interests and accelerate value creation. Our clients worldwide benefit from the skills and expertise of over 4,000 expert team members across the Americas, APAC, and EMEA. By partnering with leading technology providers, we craft transformative digital experiences that enhance customer engagement and drive business success.

About Bounteous (https://www.bounteous.com/): Founded in 2003 in Chicago, Bounteous is a leading digital experience consultancy that co-innovates with the world's most ambitious brands to create transformative digital experiences. With services in Strategy, Experience Design, Technology, Analytics and Insight, and Marketing, Bounteous elevates brand experiences through technology partnerships and drives superior client outcomes. For more information, please visit www.bounteous.com.

Information Security Responsibilities:
- Promote and enforce awareness of key information security practices, including acceptable use of information assets, malware protection, and password security protocols
- Identify, assess, and report security risks, focusing on how these risks impact the confidentiality, integrity, and availability of information assets
- Understand and evaluate how data is stored, processed, or transmitted, ensuring compliance with data privacy and protection standards (GDPR, CCPA, etc.)
- Ensure data protection measures are integrated throughout the information lifecycle to safeguard sensitive information

Preferred Qualifications:
- 7+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Working knowledge of ETL technology: Talend / Apache NiFi / AWS Glue
- Experience with relational SQL and NoSQL databases
- Experience with big data tools: Hadoop, Spark, Kafka, etc. (nice to have)
- Advanced Alteryx Designer (mandatory at this point; relaxing that would be tough)
- Tableau dashboarding
- AWS (familiarity with Lambda, EC2, AMI)
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (nice to have)
- Experience with cloud services: EMR, RDS, Redshift, or Snowflake
- Experience with stream-processing systems: Storm, Spark Streaming, etc. (nice to have)
- Experience with object-oriented/object function scripting languages: Python, Java, Scala, etc.

Responsibilities:
- Work with Project Managers, Senior Architects, and other team members from Bounteous and client teams to evaluate data systems and project requirements
- In cooperation with platform developers, develop scalable and fault-tolerant Extract Transform Load (ETL) and integration systems for various data platforms that can operate at appropriate scale, meeting security, logging, fault-tolerance, and alerting requirements
- Work on data migration projects
- Effectively communicate data requirements of various data platforms to team members
- Evaluate and document existing data ecosystems and platform capabilities
- Configure CI/CD pipelines
- Implement the proposed architecture and assist in infrastructure setup

We invite you to stay connected with us by subscribing to our monthly job openings alert here.

Research shows that women and other underrepresented groups apply only if they meet 100% of the criteria of a job posting. If you have passion and intelligence, and possess a technical knack (even if you're missing some of the above), we encourage you to apply.

Bounteous x Accolite is focused on promoting an inclusive environment and is proud to be an equal opportunity employer. We celebrate the different viewpoints and experiences our diverse group of team members bring to Bounteous x Accolite. Bounteous x Accolite does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, physical or mental disability, national origin, veteran status, or any other status protected under federal, state, or local law.

In addition, you have the opportunity to participate in several Team Member Networks, sometimes referred to as employee resource groups (ERGs), that host space for individuals with shared identities, interests, and passions. Our Team Member Networks celebrate communities of color, life as a working parent or caregiver, the 2SLGBTQIA+ community, wellbeing, and more. Regardless of your respective identity, there are various avenues we involve team members in the Bounteous x Accolite community.

Bounteous x Accolite is willing to sponsor eligible candidates for employment visas.
Posted 2 months ago
7 - 12 years
35 - 45 Lacs
Pune
Remote
Design, develop, and maintain ETL processes using DataStage for data integration and transformation. Develop and optimize complex SQL queries. Utilize Azure Data Factory (ADF) for data integration and orchestration. Work with DataStage for data warehousing.

Required Candidate Profile:
- Experience with DataStage for ETL development
- Experience with Azure Data Factory (ADF) for data integration
- Experience with data warehousing and relational database design
Posted 2 months ago
4 - 9 years
10 - 18 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
We are looking for a passionate Data Engineer with a strong background in Azure, Python, and SQL. The ideal candidate will have at least 4 years of relevant experience and will be based in Bangalore or Mumbai. You will have the opportunity to work with a leading organization and contribute to their data engineering and analytics initiatives.

Location: Bangalore/Mumbai

Your Future Employer: Our client is a leading organization in the [specific industry/sector] and is known for its innovative and forward-thinking approach. They offer a collaborative work environment and provide ample opportunities for professional growth and skill development.

Responsibilities:
- Design, build, and maintain scalable data pipelines and architectures on the Azure platform
- Collaborate with cross-functional teams to understand data requirements and develop solutions
- Optimize and troubleshoot data processes for performance and reliability
- Implement best practices for data security and compliance
- Contribute to the development and maintenance of data models and frameworks
- Stay updated with the latest trends and technologies in data engineering and analytics

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 4+ years of experience in data engineering, with strong proficiency in Azure, Python, and SQL
- Hands-on experience with building and optimizing big data pipelines and architectures
- Knowledge of data modeling, ETL processes, and data warehousing concepts
- Strong communication and teamwork skills
- Relevant certifications in Azure and data engineering will be a plus

What's in it for you:
- Competitive compensation package
- Opportunity to work with a reputable organization and contribute to impactful projects
- Professional development and training opportunities
- Collaborative and inclusive work culture

Reach us: If you feel this opportunity is well aligned with your career progression plans, please feel free to reach me with your updated profile at isha.joshi@crescendogroup.in

Disclaimer: Crescendo Global specializes in Senior to C-level niche recruitment. We are passionate about empowering job seekers and employers with an engaging, memorable job search and leadership hiring experience. Crescendo Global does not discriminate on the basis of race, religion, color, origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Note: We receive a lot of applications on a daily basis, so it becomes a bit difficult for us to get back to each candidate. Please assume that your profile has not been shortlisted in case you don't hear back from us in 1 week. Your patience is highly appreciated.

Profile keywords: Analytics, Data Engineering, Data Engineer, Azure Data Factory, Databricks, Logic Apps, SQL, Azure Synapse
Posted 2 months ago
2 - 5 years
10 - 20 Lacs
Bengaluru, Gurgaon, Mumbai (All Areas)
Work from Office
Job Description: Data Engineer, TransOrg Analytics

Why would you like to join us? TransOrg Analytics specializes in Data Science, Data Engineering, and Generative AI, providing advanced analytics solutions to industry leaders and Fortune 500 companies across India, the US, APAC, and the Middle East. We leverage data science to streamline, optimize, and accelerate our clients' businesses. Visit www.transorg.com to know more about us.

Responsibilities:
- Design, develop, and maintain robust data pipelines using Azure Data Factory and Databricks workflows
- Develop an integrated data solution in Snowflake to unify data
- Implement and manage big data solutions using Azure Databricks
- Design and maintain relational databases using Azure Delta Lake
- Ensure data quality and integrity through rigorous testing and validation
- Monitor and troubleshoot data pipelines and workflows to ensure seamless operation
- Implement data security and compliance measures in line with industry standards
- Continuously improve data infrastructure (including CI/CD) for scalability and performance
- Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into Snowflake
- Utilize ETL tools (e.g., ADF, Talend) to automate and manage data workflows
- Develop and maintain CI/CD pipelines using GitHub and Jenkins for automated deployment of data models and ETL processes
- Monitor and troubleshoot pipeline issues to ensure smooth deployment and integration
- Design and implement scalable and efficient data models in Snowflake
- Optimize data structures for performance and storage efficiency
- Collaborate with stakeholders to understand data requirements and ensure data integrity
- Integrate multiple data sources to create a data lake/data mart
- Perform data ingestion and ETL processes using SQL, Sqoop, Spark, or Hive
- Monitor job performance; manage file system/disk space, cluster & database connectivity, and log files; manage backup/security and troubleshoot various user issues
- Design, implement, test, and document a performance benchmarking strategy for platforms as well as for different use cases
- Set up, administer, monitor, tune, optimize, and govern large-scale implementations
- Drive customer communication during critical events and participate in or lead various operational improvement initiatives

Qualifications, skill set, and competencies:
- Bachelor's in Computer Science, Engineering, Statistics, Maths, or a related quantitative degree
- 2-5 years of relevant experience in data engineering
- Must have worked on one of the cloud engineering platforms: AWS, Azure, GCP, Cloudera
- Proven experience as a Data Engineer with a focus on Azure cloud technologies/Snowflake
- Strong proficiency in Azure Data Factory, Azure Databricks, ADLS, and Azure SQL Database
- Experience with big data processing frameworks like Apache Spark
- Expert-level proficiency in SQL and experience with data modeling and database design
- Knowledge of data warehousing concepts and ETL processes
- Strong focus on PySpark, Scala, and Pandas
- Proficiency in Python programming and experience with other data processing frameworks
- Solid understanding of networking concepts and Azure networking solutions
- Strong problem-solving skills and attention to detail
- Excellent communication and collaboration skills
- Azure Data Engineer certifications AZ-900 and DP-203 (good to have)
- Familiarity with DevOps practices and tools for CI/CD in data engineering
- Certification: MS Azure / DBR Data Engineer (good to have)
- Data ingestion: coding & automating ETL pipelines, both batch & streaming; should have worked on ETL or ELT methodologies using any of the traditional & new-age tech stack: SSIS, Informatica, Databricks, Talend, Glue, DMS, ADF, Spark, Kafka, Storm, Flink, etc.
- Data transformation: experience working with MPPs, big data & distributed computing frameworks on a cloud or cloud-agnostic tech stack: Databricks, EMR, Hadoop, DBT, Spark, etc.
- Data storage: experience working on data lakes and lakehouse architecture: S3, ADLS, Blob, HDFS
- DWH: strong experience modelling & implementing data warehousing on technologies like Redshift, Snowflake, Azure Synapse, BigQuery, Hive
- Orchestration & lineage: Airflow, Oozie, etc.
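As a hedged illustration of the ADF/Databricks pipeline pattern this posting describes, here is a minimal PySpark sketch that ingests raw files, de-duplicates them, and appends to a Delta table; the storage paths and column names are hypothetical, and the Delta writer assumes a Databricks or delta-enabled runtime:

```python
# Minimal sketch: ingest raw CSVs, de-duplicate, persist to a Delta table.
# Paths and the "order_id" key are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = (spark.read.format("csv")
       .option("header", "true")
       .option("inferSchema", "true")
       .load("abfss://landing@mylake.dfs.core.windows.net/orders/"))

clean = (raw.dropDuplicates(["order_id"])               # de-duplication step
            .withColumn("ingested_at", F.current_timestamp()))

(clean.write.format("delta")                            # lakehouse persistence
      .mode("append")
      .save("abfss://curated@mylake.dfs.core.windows.net/orders/"))
```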
Posted 2 months ago
5 - 10 years
8 - 18 Lacs
Bengaluru, Hyderabad
Hybrid
Job Description for QA Engineer:
- 6-10 years of experience in ETL testing, Snowflake, and DWH concepts
- Strong SQL knowledge and debugging skills are a must
- Experience in Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; ETL tools like Talend Cloud Data Integration, Pentaho/Kettle
- Experience in JIRA and the Xray defect management tool is good to have
- Exposure to financial domain knowledge is considered a plus
- Test data readiness (data quality) and address code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrated strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and come up with a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus

Key attributes include:
- Team player with a professional and positive approach
- Creative, innovative, and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated and self-sufficient
- Effective communicator, both written and verbal
- Brings a high level of energy and enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
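As a hedged sketch of the kind of data-readiness check this QA role describes, here is a small reconciliation test comparing source and target row counts; the cursor fixtures, table names, and load-date column are hypothetical, and the %s parameter style assumes a DB-API driver such as snowflake-connector-python:

```python
# Hypothetical reconciliation check for an ETL load window.
# source_cur / target_cur are assumed pytest fixtures yielding DB-API cursors.
def rowcount(cursor, table: str, load_date: str) -> int:
    # Count rows for one load window; table names are placeholders.
    cursor.execute(
        f"SELECT COUNT(*) FROM {table} WHERE load_date = %s", (load_date,)
    )
    return cursor.fetchone()[0]

def test_full_load_reconciles(source_cur, target_cur):
    src = rowcount(source_cur, "STG_TRADES", "2024-01-31")
    tgt = rowcount(target_cur, "DW_TRADES", "2024-01-31")
    assert src == tgt, f"Row count mismatch: source={src}, target={tgt}"
```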
Posted 2 months ago
5 - 10 years
20 - 25 Lacs
Hyderabad
Work from Office
Snowflake Matillion Developer - Reputed US-based IT MNC
If you are a Snowflake Matillion Developer, email your CV to jagannaath@kamms.net or WhatsApp your CV to 70926 89999.
Experience: 5+ years (must be 100% real-time experience)
Role: Snowflake Matillion Developer
Preferred: Snowflake certifications (SnowPro Core/Advanced)
Position Type: Full time / Permanent
Location: Hyderabad (Work from Office)
Notice Period: Immediate to 15 days
Working hours: 04:30 PM to 12:30 AM
Salary: As per your experience
Interviews: 1. Technical (online); 2. Final (face to face)

Responsibilities:
- 5+ years of experience in data engineering, ETL, and Snowflake development
- Strong expertise in Snowflake SQL scripting, performance tuning, and data warehousing concepts
- Hands-on experience with Matillion ETL: building and maintaining Matillion jobs
- Strong knowledge of cloud platforms (AWS/Azure/GCP) and cloud-based data architecture
- Proficiency in SQL, Python, or scripting languages for automation and transformation
- Experience with API integrations and data ingestion frameworks
- Understanding of data governance, security policies, and access control in Snowflake
- Excellent communication skills: ability to interact with business and technical stakeholders
- Self-starter who can work independently and drive projects to completion
Posted 2 months ago
2 - 4 years
4 - 6 Lacs
Noida
Work from Office
Responsibilities:
- Configure the Intake solution
- Communicate with the deployment lead
- Uphold build standards
- Motivate stakeholders to execute operational change and achieve desired results
- Assist in creating, implementing, and upholding standard processes and workflows
- Collaborate with stakeholders to design educational materials and coordinate training activities to prepare for operational readiness
- Assess applicable patient experience initiative enhancements and expansion
- Demonstrate an ability to identify opportunities to gain efficiencies relating to patient experience initiatives

Required Qualifications:
- Bachelor's degree
- Business management or healthcare experience
- Excellent organization and attention-to-detail skills
- Highly developed verbal and written communication skills
- Excellent decision-making, communication, and collaboration skills with proven cross-functional and multi-level relationship-building skills
- Strong analytical skills and ability
- Ability to solve problems outside of one's area of expertise

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit: r1rcm.com
Posted 2 months ago
4 - 7 years
3 - 5 Lacs
Pune
Work from Office
Position: SQL Developer
Employment Type: Full Time
Location: Pune, India
Salary: TBC
Work Experience: Applicants for this position should have 4+ years working as a SQL developer.

Project Overview: The project will use a number of Microsoft SQL Server technologies and includes development and maintenance of reports, APIs, and other integrations with external financial systems. The successful applicant will liaise with other members of the team and will be expected to work on projects where they are the sole developer as well as part of a team on larger projects. The applicant will report to the SQL Development Manager.

Job Description:
- Ability to understand requirements clearly and communicate technical ideas to both technical stakeholders and business end users
- Investigate and resolve issues quickly
- Communication with end users
- Work closely with other team members to understand business requirements
- Complete structural analysis and systematic testing of the data

Skills:
- Microsoft SQL Server 2016 - 2022
- T-SQL programming (4+ years' experience)
- Query/stored procedure performance tuning
- SQL Server Integration Services
- SQL Server Reporting Services
- Experience in database design
- Experience with source control
- Knowledge of the software engineering life cycle
- Previous experience in designing, developing, testing, implementing, and supporting software
- Third-level IT qualification; MCSA or MCSE preferable
- Knowledge of data technologies such as Snowflake, Airflow, ADF desirable

Other skills:
- Ability to work on own initiative and as part of a team
- Excellent time management and decision-making skills
- Excellent communication skills in English, both written and verbal
- Background in the financial industry preferable

Academic Qualification: Any graduate or postgraduate degree; any specialization in IT.
Posted 2 months ago
3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
About The Role

General Skills:
- Good interpersonal skills and ability to manage multiple tasks with enthusiasm
- Interact with clients to understand the requirements
- 4 to 8 years of total IT experience, with min 3+ years in Power BI

Technical Skills:
- Understand business requirements in the MSBI context and design data models to transform raw data into meaningful insights
- Awareness of star and snowflake schemas in DWH and DWH concepts
- Should be familiar and experienced in T-SQL
- Good knowledge and experience in SSIS (ETL)
- Creation of dashboards & visual interactive reports using Power BI
- Extensive experience in both Power BI Service & Power BI on-premise environments
- Create relationships between data sources and develop data models accordingly
- Experience in implementing tabular models and row-level data security
- Experience in writing and optimizing DAX queries
- Experience in Power Automate flows
- Performance tuning and optimization of Power BI reports
- Good understanding of data warehouse concepts
- Knowledge of Microsoft Azure analytics is a plus
- Good to have Azure BI skills (ADF, ADB, Synapse)
- Good UI/UX experience/knowledge
- Knowledge of Tabular Models
- Knowledge of Paginated Reports

General Skills:
- Good interpersonal skills and ability to manage multiple tasks with enthusiasm
- Interact with clients to understand the requirements
- Up-to-date knowledge of best practices and advancements in Power BI
- Should have an analytical and problem-solving mindset and approach
- 6 to 10 years of total IT experience, with min 3+ years in Power BI

Technical Skills:
- Understand business requirements in the BI context and design data models to transform raw data into meaningful insights
- Good knowledge of all variants of Power BI (Power BI Embedded, Power BI Service, Power BI Report Server)
- Strong SQL skills and SQL performance tuning
- Provide expertise in data modeling and database design, and provide recommendations
- Make suggestions on best practices in implementing data models, ETL packages, OLAP cubes, and reports
- Experience working with DirectQuery and import modes
- Expertise in implementing static & dynamic row-level security
- Knowledge of integrating Power BI reports into external web applications
- Should have experience setting up data gateways & data preparation
- Creation of dashboards & visual interactive reports using Power BI
- Experience working with third-party custom visuals like Zebra BI, etc.
- Create relationships between data sources and develop data models accordingly
- Good knowledge of the various DAX functions and ability to write complex DAX queries
- Awareness of star and snowflake schemas in DWH and DWH concepts
- Knowledge of Tabular Models
- Familiarity with creating T-SQL objects, scripts, views, and stored procedures
Posted 2 months ago
8 - 13 years
10 - 15 Lacs
Hyderabad
Work from Office
About The Role: ETL Architect [Data Analytics ETL + ADF (Azure Data Factory)]
Experience: 8+ years total, 4+ years relevant
Must have: Informatica, ODI, Azure Data Factory, SQL, data modeling, ETL design & architecture, performance tuning
Optional: Snowflake, Oracle, Power BI, Python
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Andhra Pradesh
Work from Office
Description
Primary/Mandatory Experience:
- Experience building data pipelines in Python along with AWS services (S3, SNS, CloudWatch, Lambda, Step Functions, etc.)
- Proficient in AWS serverless technologies
- Technical knowledge of Extract/Transform/Load (ETL) solutions for running analytic projects on the cloud
- Candidates must have hands-on technical experience with AWS cloud-native technologies along with traditional ETL tools
- Snowflake & DWH experience is a plus

Daily Activity:
- Excellent written and verbal communication skills; able to lead meetings with technical peers and clients regarding solution designs
- Ability to communicate with business analysts, data modellers, cloud architects, and technical developers
- Ability to lead development teams
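As a hedged sketch of the AWS-serverless pipeline stack named above, here is a minimal Python helper that starts a Step Functions execution for a batch run; the state machine ARN and input payload are hypothetical placeholders:

```python
# Hypothetical sketch: kick off a serverless ETL run via AWS Step Functions.
import json
import boto3

sfn = boto3.client("stepfunctions")

def start_etl_run(batch_id: str) -> str:
    """Start one state-machine execution for a batch and return its ARN."""
    resp = sfn.start_execution(
        stateMachineArn=(
            "arn:aws:states:us-east-1:123456789012:stateMachine:etl"  # placeholder
        ),
        name=f"etl-{batch_id}",                 # execution names must be unique
        input=json.dumps({"batch_id": batch_id}),
    )
    return resp["executionArn"]
```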
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Andhra Pradesh
Work from Office
Description
Primary/Mandatory Experience:
- Experience building data pipelines in Python along with AWS services (S3, SNS, CloudWatch, Lambda, Step Functions, etc.)
- Proficient in AWS serverless technologies
- Technical knowledge of Extract/Transform/Load (ETL) solutions for running analytic projects on the cloud
- Candidates must have hands-on technical experience with AWS cloud-native technologies along with traditional ETL tools
- Snowflake & DWH experience is a plus

Daily Activity:
- Excellent written and verbal communication skills; able to lead meetings with technical peers and clients regarding solution designs
- Ability to communicate with business analysts, data modellers, cloud architects, and technical developers
- Ability to lead development teams
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Andhra Pradesh
Work from Office
Description
1. Hands-on industry experience in design and coding from scratch in AWS Glue / PySpark with services like S3, DynamoDB, Step Functions, etc.
2. Hands-on industry experience in design and coding from scratch in Snowflake
3. 1 to 3 years of experience in PySpark/Snowflake, with overall around 5 years of experience building data/analytics solutions
Level: Senior Consultant or below
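As a hedged sketch of the Glue/PySpark from-scratch coding this posting asks for, here is a minimal Glue job that reads a catalog table, drops duplicates, and writes Parquet to S3; the database, table, and bucket names are hypothetical, and the script assumes the AWS Glue job runtime:

```python
# Hypothetical AWS Glue job: catalog read -> de-duplicate -> Parquet to S3.
# Runs only inside the Glue runtime; names below are placeholders.
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue = GlueContext(SparkContext.getOrCreate())

# Read from the Glue Data Catalog (placeholder database/table).
dyf = glue.create_dynamic_frame.from_catalog(database="raw", table_name="events")

# Drop exact duplicate rows via the underlying Spark DataFrame.
deduped = DynamicFrame.fromDF(dyf.toDF().dropDuplicates(), glue, "deduped")

# Write curated Parquet files to a placeholder S3 path.
glue.write_dynamic_frame.from_options(
    frame=deduped,
    connection_type="s3",
    connection_options={"path": "s3://my-curated-bucket/events/"},
    format="parquet",
)
```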
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Karnataka
Work from Office
Experienced data modelers with SQL, ETL, and some development background to define new data schemas and data ingestion for Adobe Experience Platform customers. Interface directly with enterprise customers and collaborate with internal teams.
- 10+ years of strong experience with data transformation & ETL on large data sets
- 5+ years of data modelling experience (i.e., relational, dimensional, columnar, big data)
- 5+ years of complex SQL or NoSQL experience
- Experience in advanced data warehouse concepts
Posted 2 months ago