4.0 - 8.0 years
0 Lacs
Nagpur, Maharashtra
On-site
As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should be highly proficient in writing complex yet efficient SQL queries. Your role will involve working extensively on PL/SQL packages, procedures, functions, triggers, views, materialized views, external tables, partitions, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle.

In this position, it is essential to have experience in data modeling and warehousing concepts such as star schema, OLAP, OLTP, snowflake schema, fact tables for measurements, and dimension tables. Additionally, familiarity with UNIX scripting, Python/Spark, and big data concepts will be beneficial.

If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.
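The posting above centers on fact and dimension tables in a star schema. As a minimal, illustrative sketch (not tied to any employer's environment), the Python snippet below builds a tiny fact/dimension pair in SQLite and runs a typical rollup query; all table, column, and value names are hypothetical.

```python
import sqlite3

# Minimal star-schema illustration: one dimension table and one fact table.
# All names and values are hypothetical, for demonstration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product_name TEXT,
        category TEXT
    )
""")
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        sale_date TEXT,
        quantity INTEGER,
        amount REAL
    )
""")

cur.executemany(
    "INSERT INTO dim_product VALUES (?, ?, ?)",
    [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware"), (3, "Service Plan", "Services")],
)
cur.executemany(
    "INSERT INTO fact_sales VALUES (?, ?, ?, ?, ?)",
    [(1, 1, "2024-01-05", 3, 29.97), (2, 2, "2024-01-06", 1, 49.99), (3, 1, "2024-02-01", 2, 19.98)],
)

# A typical dimensional query: measures from the fact table grouped by a
# dimension attribute, the kind of rollup a star schema is built for.
cur.execute("""
    SELECT d.category, SUM(f.amount) AS total_amount, SUM(f.quantity) AS units
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
    ORDER BY total_amount DESC
""")
for row in cur.fetchall():
    print(row)
conn.close()
```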
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
Punjab
On-site
You are an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of expertise in data integration, warehousing, and analytics. The role requires deep technical knowledge of ETL tools, strong data modeling skills, and the ability to lead intricate data engineering projects from inception to implementation.

Your key skills include:
- Utilizing ETL tools such as SSIS, Informatica, DataStage, or Talend for more than 4 years.
- Proficiency in relational databases like SQL Server and MySQL.
- Comprehensive understanding of Data Mart/EDW methodologies.
- Designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Familiarity with reporting and analytics tools like Tableau and Power BI.
- Proficiency in scripting and programming using Python.
- Knowledge of cloud platforms like AWS or Azure.
- Leading recruitment, estimation, and project execution.
- Exposure to Sales and Marketing data domains.
- Working with cross-functional and geographically distributed teams.
- Translating complex data issues into actionable insights.
- Strong communication and client management abilities.
- An initiative-driven, collaborative approach and a problem-solving mindset.

Your roles & responsibilities will include:
- Creating high-level and low-level design documents for middleware and ETL architecture.
- Designing and reviewing data integration components while ensuring compliance with standards and best practices.
- Ensuring delivery quality and timeliness for one or more complex projects.
- Providing functional and non-functional assessments for global data implementations.
- Offering technical guidance and support to junior team members for problem-solving.
- Leading QA processes for deliverables and validating progress against project timelines.
- Managing issue escalation, status tracking, and continuous improvement initiatives.
- Supporting planning, estimation, and resourcing for data engineering efforts.
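ETL work of the kind described above often comes down to incremental loads that merge staged rows into a warehouse target. The sketch below is a hedged illustration in Python, with SQLite standing in for the target database; the staging/target table names and the natural key are assumptions, not taken from the posting.

```python
import sqlite3

# Illustrative incremental-load (upsert) pattern: merge staged rows into a
# target table keyed on a natural key. SQLite stands in for the warehouse;
# table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE stg_customers (customer_id INTEGER, name TEXT, city TEXT)")
cur.execute("CREATE TABLE dim_customers (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)")

# Pretend these rows arrived from a source extract.
cur.executemany("INSERT INTO stg_customers VALUES (?, ?, ?)",
                [(1, "Asha", "Pune"), (2, "Ravi", "Nagpur")])
cur.execute("INSERT INTO dim_customers VALUES (1, 'Asha', 'Mumbai')")  # stale row

# Upsert: insert new keys, update changed attributes for existing keys.
# The WHERE true clause avoids a known SQLite parsing ambiguity for
# INSERT ... SELECT combined with ON CONFLICT.
cur.execute("""
    INSERT INTO dim_customers (customer_id, name, city)
    SELECT customer_id, name, city FROM stg_customers WHERE true
    ON CONFLICT(customer_id) DO UPDATE SET
        name = excluded.name,
        city = excluded.city
""")
conn.commit()
print(cur.execute("SELECT * FROM dim_customers ORDER BY customer_id").fetchall())
conn.close()
```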
Posted 6 days ago
4.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Hybrid
POWER BI/SQL EXPERT - Relevant experience: 4 to 6 years

Key Responsibilities & Must-Have Skills:

Power BI Expertise: Strong hands-on experience with Power BI using Import Mode and DirectQuery. Skilled in DAX, report/dashboard creation, data modeling, and building dimension and fact tables. Proven ability to optimize and manage large datasets for performance in Power BI. Overall 7-8 years of experience, with 4-5 years specifically in Power BI.

SQL & Database Skills: Proficient in SQL, with secondary expertise in Oracle SQL. Ability to manipulate data, apply filters, and extract meaningful insights.

Communication & Stakeholder Management: Strong communication skills are required to coordinate directly with the Product Owner. Comfortable working in client-facing roles within the Accounts domain.
Posted 2 weeks ago
3.0 - 8.0 years
0 - 3 Lacs
Bengaluru
Remote
If you are passionate about Snowflake, data warehousing, and cloud-based analytics, we'd love to hear from you! Apply now to be a part of our growing team.

Interested candidates can apply directly through the link below and complete the first round of technical discussion: https://app.hyrgpt.com/candidate-job-details?jobId=67ecc88dda1154001cc8b88f

Job Summary: We are looking for a skilled Snowflake Engineer with 3-10 years of experience in designing and implementing cloud-based data warehousing solutions. The ideal candidate will have hands-on expertise in Snowflake architecture, SQL, ETL pipeline development, and performance optimization. This role requires proficiency in handling structured and semi-structured data, data modeling, and query optimization to support business intelligence and analytics initiatives. The candidate will work on a project for one of our key Big 4 consulting customers and will have immense learning opportunities.

Key Responsibilities:
- Design, develop, and manage high-performance data pipelines for ingestion, transformation, and storage in Snowflake.
- Optimize Snowflake workloads, ensuring efficient query execution and cost management.
- Develop and maintain ETL processes using SQL, Python, and orchestration tools.
- Implement data governance, security, and access control best practices within Snowflake.
- Work with structured and semi-structured data formats such as JSON, Parquet, Avro, and XML.
- Design and maintain fact and dimension tables, ensuring efficient data warehousing and reporting.
- Collaborate with data analysts and business teams to support reporting, analytics, and business intelligence needs.
- Troubleshoot and resolve data pipeline issues, ensuring high availability and reliability.
- Monitor and optimize Snowflake storage and compute usage to improve efficiency and performance.

Required Skills & Qualifications:
- 3-10 years of experience in Snowflake, SQL, and data engineering.
- Strong hands-on expertise in Snowflake development, including data sharing, cloning, and time travel.
- Proficiency in SQL scripting for query optimization and performance tuning.
- Experience with ETL tools and frameworks (e.g., dbt, Airflow, Matillion, Talend).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their integration with Snowflake.
- Strong understanding of data warehousing concepts, including fact and dimension modeling.
- Ability to work with semi-structured data formats like JSON, Avro, Parquet, and XML.
- Knowledge of data security, governance, and access control within Snowflake.
- Excellent problem-solving and troubleshooting skills.

Preferred Qualifications:
- Experience in Python for data engineering tasks.
- Familiarity with CI/CD pipelines for Snowflake development and deployment.
- Exposure to streaming data ingestion and real-time processing.
- Experience with BI tools such as Tableau, Looker, or Power BI.
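The Snowflake role above emphasizes semi-structured JSON handling, which in Snowflake itself is usually done with the VARIANT type and LATERAL FLATTEN in SQL. The short Python sketch below only illustrates the same idea, flattening a nested document into tabular rows using the standard library; the payload and field names are hypothetical.

```python
import json

# Illustration of flattening semi-structured JSON into tabular rows, the kind
# of shaping a Snowflake VARIANT column plus LATERAL FLATTEN performs in SQL.
raw = """
{
  "order_id": 1001,
  "customer": {"id": 7, "name": "Asha"},
  "items": [
    {"sku": "A-1", "qty": 2, "price": 9.99},
    {"sku": "B-3", "qty": 1, "price": 24.50}
  ]
}
"""

doc = json.loads(raw)

# One output row per nested item, with parent attributes repeated - the same
# shape a fact-table load would expect.
rows = [
    {
        "order_id": doc["order_id"],
        "customer_id": doc["customer"]["id"],
        "sku": item["sku"],
        "qty": item["qty"],
        "line_amount": round(item["qty"] * item["price"], 2),
    }
    for item in doc["items"]
]

for row in rows:
    print(row)
```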
Posted 1 month ago
7.0 - 8.0 years
9 - 10 Lacs
Mumbai
Work from Office
This Role Includes:
- Leading the RPA development team, with a mandate for finding new automation opportunities, requirement analysis, and shaping the solution approach for business process transformation using RPA.
- Leading the design, development, and deployment of RPA bots for different clients.
- Supporting different teams with solution lifecycle management, ongoing operational support, process change activities, etc.
- Assisting and driving the team by providing oversight and mentoring.

Requirements:
- Hands-on experience working with the RE Framework.
- Hands-on experience working with data tables, arguments, and variables.
- Hands-on experience working with selectors.
- Understanding of PDF automation.
- Hands-on experience working with and creating libraries.
- Hands-on experience with debugging, breakpoints, and watch points.
- Understanding of Orchestrator and the deployment process.
- Hands-on experience in error and exception handling.
- Analysis of business requirements and effort estimation.
- UiPath Developer Certification.
- Understanding of ABBYY integration.
- Experience in a .NET language.
- Understanding of machine learning with Python programming.
- Hands-on experience in PDF automation.
- Strong working knowledge of SQL and relational databases.
- Experience in Citrix automation.
- Experience in using regex.
- Understanding of integration with APIs.
- Experience in image automation.
- Experience in document understanding.
- Understanding of machine learning models and their capabilities in UiPath.

Experience/Skills Required: Overall 7-8 years of experience, with a minimum of 4-5 years of experience in RPA (preferably using UiPath). (ref:hirist.tech)
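Several of the requirements above (regex, PDF and document automation) amount to extracting structured fields from unstructured text. The sketch below is a small, hypothetical Python illustration of that pattern; the field formats (invoice number, date, amount) are assumptions for demonstration, not taken from the posting.

```python
import re

# Hypothetical text as it might come out of a PDF text-extraction step.
page_text = """
Invoice No: INV-2024-00517
Invoice Date: 12/03/2024
Total Amount Due: INR 45,250.00
"""

# Illustrative patterns; real documents would need formats agreed with the client.
patterns = {
    "invoice_number": r"Invoice No:\s*(INV-\d{4}-\d{5})",
    "invoice_date": r"Invoice Date:\s*(\d{2}/\d{2}/\d{4})",
    "total_amount": r"Total Amount Due:\s*INR\s*([\d,]+\.\d{2})",
}

extracted = {}
for field, pattern in patterns.items():
    match = re.search(pattern, page_text)
    # Missing fields are recorded as None so a downstream bot can raise a
    # business exception instead of failing silently.
    extracted[field] = match.group(1) if match else None

print(extracted)
```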
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.

Key Responsibilities:
- Design and maintain enterprise data warehouse architecture
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas)
- Ensure data quality, security, and performance
- Work with BI teams to support analytics and reporting needs

Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.)
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.)
- Strong understanding of dimensional modeling and OLAP
- Bonus: Knowledge of cloud data platforms and orchestration tools (Airflow)

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
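The architect role above lists data quality alongside pipeline and schema work. As one assumed (not prescribed) way to frame it, the sketch below runs two basic quality checks after a load, a minimum row count and a null-rate check on a key column, against an in-memory SQLite table; table and column names are hypothetical.

```python
import sqlite3

# Simple data-quality checks of the kind a warehouse load might run after each
# batch: minimum row count and null rate on a key column. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact_orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)",
                [(1, 10, 99.0), (2, None, 15.5), (3, 12, 42.0)])

def check_min_rows(table: str, minimum: int) -> bool:
    count = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return count >= minimum

def check_null_rate(table: str, column: str, max_rate: float) -> bool:
    total, nulls = cur.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) FROM {table}"
    ).fetchone()
    return total > 0 and (nulls / total) <= max_rate

results = {
    "fact_orders has rows": check_min_rows("fact_orders", 1),
    "customer_id null rate <= 10%": check_null_rate("fact_orders", "customer_id", 0.10),
}
for name, passed in results.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
conn.close()
```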
Posted 1 month ago
2.0 - 5.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Genesys Cloud CX
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Genesys Cloud CX.
- Good-To-Have Skills: Experience with cloud-based application development.
- Must have Genesys Architect & Developer experience with Data Actions and integrations.
- Must have call flow build experience with Data Tables, call routing, and Data Actions in Genesys.
- Must have Genesys IVR development experience with Google Dialogflow or Genesys flow engine bots.
- Must be familiar with Genesys APIs and data.
- Must be familiar with Genesys L3 support cases and escalation(s).
- Strong understanding of application lifecycle management.
- Familiarity with integration techniques for various business applications.
- Experience in troubleshooting and optimizing application performance.

Additional Information:
- Must have 6+ years of working experience in Genesys, with a minimum of 1-2 years in Genesys Cloud.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 months ago
9 - 11 years
37 - 40 Lacs
Ahmedabad, Bengaluru, Mumbai (All Areas)
Work from Office
Dear Candidate,

We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. Ideal for those passionate about data integrity, automation, and performance.

Key Responsibilities:
- Design ETL/ELT pipelines using tools like Airflow or dbt
- Build data lakes and warehouses (BigQuery, Redshift, Snowflake)
- Automate data quality checks and monitoring
- Collaborate with analysts, data scientists, and backend teams
- Optimize data flows for performance and cost

Required Skills & Qualifications:
- Proficiency in SQL, Python, and distributed systems (e.g., Spark)
- Experience with cloud data platforms (AWS, GCP, or Azure)
- Strong understanding of data modeling and warehousing principles
- Bonus: Experience with Kafka, Parquet/Avro, or real-time streaming

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
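The Data Engineer posting above mentions designing ETL/ELT pipelines with tools like Airflow. Below is a minimal Airflow 2.x DAG sketch showing the usual extract, transform, load task chain; the DAG id, schedule, and task bodies are placeholders, and it assumes Apache Airflow 2.x is installed.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; a real pipeline would pull from a source system,
# shape the data, and load it into a warehouse such as BigQuery or Snowflake.
def extract():
    print("extract: pull raw records from the source")

def transform():
    print("transform: clean and model the records")

def load():
    print("load: write modeled records to the warehouse")

with DAG(
    dag_id="daily_sales_etl",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```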
Posted 2 months ago
5 - 8 years
9 - 13 Lacs
Mumbai
Work from Office
This Role Includes:
- Leading the RPA development team, with a mandate for finding new automation opportunities, requirement analysis, and shaping the solution approach for business process transformation using RPA
- Leading the design, development, and deployment of RPA bots for different clients
- Supporting different teams with solution lifecycle management - ongoing operational support, process change activities, etc.
- Assisting and driving the team by providing oversight and mentoring.

Requirements:
- Hands-on experience working with the RE Framework
- Hands-on experience working with data tables, arguments, and variables
- Hands-on experience working with selectors
- Understanding of PDF automation
- Hands-on experience working with and creating libraries
- Hands-on experience with debugging, breakpoints, and watch points
- Understanding of Orchestrator and the deployment process
- Hands-on experience in error and exception handling
- Analysis of business requirements and effort estimation
- UiPath Developer Certification
- Understanding of ABBYY integration
- Experience in a .NET language
- Understanding of machine learning with Python programming
- Hands-on experience in PDF automation
- Strong working knowledge of SQL and relational databases
- Experience in Citrix automation
- Experience in using regex
- Understanding of integration with APIs
- Experience in image automation
- Experience in document understanding
- Understanding of machine learning models and their capabilities in UiPath

Experience/skills required: Overall 7-8 years of experience, with a minimum of 4-5 years of experience in RPA (preferably using UiPath).
Posted 2 months ago