
24 Matillion Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 20.0 years

10 - 40 Lacs

Pune, Delhi / NCR, Greater Noida

Work from Office

Source: Naukri

Mandatory Skills - Snowflake, Matillion

Posted 1 week ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH and 2-4 years of experience implementing DWH on Snowflake using Matillion. Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily using Matillion Cloud. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Perform data validation and end-to-end testing of ETL objects, source data analysis, and data profiling. Troubleshoot and resolve issues related to Matillion development and data integration. Collaborate with business users to create architecture aligned with business needs, and collaborate in developing project requirements for end-to-end data integration using ETL for structured, semi-structured, and unstructured data. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.
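As a flavor of the "data validation & end-to-end testing of ETL objects" work this role describes, a minimal post-load reconciliation check might look like the following. This is an illustrative plain-Python sketch: the in-memory lists stand in for the actual source extract and Snowflake target table, and all names are hypothetical.

```python
# Post-load validation sketch: reconcile a source extract against the
# rows landed in the target. In a real Matillion/Snowflake pipeline these
# lists would come from queries; here they are in-memory stand-ins.

def reconcile(source_rows, target_rows, key):
    """Compare row counts and key sets; return a dict of findings."""
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]

report = reconcile(source, target, key="id")
# Row 3 never reached the target, so the report flags it.
```

The same key-set comparison extends naturally to checksums or column-level aggregates when row counts alone are not enough.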

Posted 2 weeks ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Pune, Maharashtra, India

On-site

Source: Foundit

Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH and 2-4 years of experience implementing DWH on Snowflake using Matillion. Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily using Matillion Cloud. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Perform data validation and end-to-end testing of ETL objects, source data analysis, and data profiling. Troubleshoot and resolve issues related to Matillion development and data integration. Collaborate with business users to create architecture aligned with business needs, and collaborate in developing project requirements for end-to-end data integration using ETL for structured, semi-structured, and unstructured data. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Source: Foundit

Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH and 2-4 years of experience implementing DWH on Snowflake using Matillion. Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily using Matillion Cloud. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Perform data validation and end-to-end testing of ETL objects, source data analysis, and data profiling. Troubleshoot and resolve issues related to Matillion development and data integration. Collaborate with business users to create architecture aligned with business needs, and collaborate in developing project requirements for end-to-end data integration using ETL for structured, semi-structured, and unstructured data. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer provides technical direction and leads a group of one or more developers toward a goal.

Job description: Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities. Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy. Implement conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents. Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows. Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance. Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis. Develop and maintain data documentation, best practices, and data governance protocols. Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Responsibilities: Bachelor's degree in Computer Science, Data Engineering, or a related field. Experience in data engineering, with experience working with Snowflake. Proven experience with Snowflake Cortex AI, focusing on data extraction, chatbot development, and conversational AI. Strong proficiency in SQL, Python, and data modeling. Experience with data integration tools (e.g., Matillion, Talend, Informatica). Knowledge of cloud platforms such as AWS, Azure, or GCP. Excellent problem-solving skills, with a focus on data quality and performance optimization. Strong communication skills and the ability to work effectively in a cross-functional team. Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations. Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities. Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing. Experience building data ingestion pipelines. Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight. Good experience implementing CDC or SCD Type 2. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have: experience with repository tools such as GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, Computer Engineering, or an equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skill sets. Skill matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, data warehousing concepts.

Why join Genpact? Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation. Make an impact: drive change for global enterprises and solve business challenges that matter. Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities. Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
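The SCD Type 2 experience this role asks for amounts to closing out a dimension record when an attribute changes and inserting a new current version, so history is preserved. Below is a small illustrative sketch of that logic in plain Python; a warehouse implementation would use a Snowflake MERGE or a DBT snapshot instead, and the field names here are hypothetical.

```python
# SCD Type 2 sketch: when a tracked attribute changes, close the current
# record (end-date it) and append a new current version. Dates are plain
# strings to keep the example self-contained.

def apply_scd2(dimension, incoming, key, today):
    """Return the dimension table after applying one incoming record."""
    out, matched = [], False
    for row in dimension:
        if row[key] == incoming[key] and row["is_current"]:
            matched = True
            if row["attrs"] != incoming["attrs"]:
                # Close the old version, then append the new one.
                out.append(dict(row, is_current=False, end_date=today))
                out.append({key: incoming[key], "attrs": incoming["attrs"],
                            "start_date": today, "end_date": None,
                            "is_current": True})
            else:
                out.append(row)  # no change: keep the row as-is
        else:
            out.append(row)
    if not matched:
        # Brand-new key: insert the first version.
        out.append({key: incoming[key], "attrs": incoming["attrs"],
                    "start_date": today, "end_date": None,
                    "is_current": True})
    return out

dim = [{"cust_id": 7, "attrs": {"city": "Pune"},
        "start_date": "2024-01-01", "end_date": None, "is_current": True}]
dim = apply_scd2(dim, {"cust_id": 7, "attrs": {"city": "Delhi"}},
                 key="cust_id", today="2024-06-01")
# dim now holds two versions: the closed Pune row and a current Delhi row.
```

CDC feeds the `incoming` records; the same routine then decides, per key, whether history needs a new version.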

Posted 2 weeks ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH and 2-4 years of experience implementing DWH on Snowflake using Matillion. Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily using Matillion Cloud. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Perform data validation and end-to-end testing of ETL objects, source data analysis, and data profiling. Troubleshoot and resolve issues related to Matillion development and data integration. Collaborate with business users to create architecture aligned with business needs, and collaborate in developing project requirements for end-to-end data integration using ETL for structured, semi-structured, and unstructured data. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.

Posted 3 weeks ago

Apply

5 - 10 years

0 - 1 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

JD for Snowflake Admin

Key responsibilities: Administer and manage Snowflake environments, including user roles, access control, and resource monitoring. Develop, test, and deploy ELT/ETL pipelines using Snowflake SQL and other tools (e.g., Informatica, DBT, Matillion). Monitor query performance and storage utilization; implement performance tuning and optimization strategies. Manage and automate tasks such as warehouse scaling, Snowpipe ingestion, and task scheduling. Work with semi-structured data formats (JSON, XML, Avro, Parquet) using VARIANT and related functions. Set up and manage data sharing, replication, and failover across Snowflake accounts. Implement and manage security best practices, including RBAC, masking policies, and object-level permissions. Collaborate with data engineers, architects, and BI teams to support analytics use cases.

Required skills: Strong hands-on experience with Snowflake architecture, SQL, and performance tuning. Experience with Snowflake features such as Streams, Tasks, Time Travel, Cloning, and External Tables. Proficiency in working with SnowSQL and managing CLI-based operations. Knowledge of cloud platforms (AWS/Azure/GCP) and their integration with Snowflake. Experience with data ingestion tools and scripting languages (Python, Shell, etc.). Good understanding of CI/CD pipelines and version control (Git).
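The semi-structured (VARIANT) work mentioned above boils down to extracting typed fields from nested JSON. The following is a small self-contained Python analogue of what Snowflake's colon-path access (e.g. `v:customer.city`) does; the function and sample payload are illustrative, not part of any Snowflake API.

```python
import json

# Python analogue of reading fields out of a VARIANT column: parse the
# raw JSON, then walk a dotted path such as "customer.city". Missing
# keys yield a default, mirroring SQL's NULL semantics.

def variant_get(raw_json, path, default=None):
    """Walk a dotted path into parsed JSON, like v:customer.city in SQL."""
    node = json.loads(raw_json)
    for part in path.split("."):
        if isinstance(node, dict) and part in node:
            node = node[part]
        else:
            return default
    return node

raw = '{"customer": {"name": "Asha", "city": "Hyderabad"}, "items": [1, 2]}'
city = variant_get(raw, "customer.city")  # present: returns the value
zipc = variant_get(raw, "customer.zip")   # absent: returns the default
```

In Snowflake itself the equivalent is a path expression plus a cast; the point of the sketch is the traverse-or-NULL behavior that makes VARIANT queries tolerant of ragged documents.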

Posted 1 month ago

Apply

6 - 10 years

8 - 18 Lacs

Kolhapur, Hyderabad, Chennai

Work from Office

Source: Naukri

Relevant experience: 5+ years. Mandatory skills: Snowflake architecture, Matillion, SQL, Python, SnowSQL, any cloud experience. Night shift (6 PM to 3 AM). Complete WFO, 5 days. Email: anusha@akshayaitsolutions.com. Locations: Hyderabad/Bangalore/Chennai/Kolhapur.

Posted 1 month ago

Apply

8 - 13 years

10 - 20 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the following details ASAP: C.CTC, E.CTC, notice period, current location, whether you are serving notice/immediately available, experience in Snowflake, and experience in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply.

Interview process: 1 round (virtual) + final round (F2F). Please note: WFO, work from office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehousing, Python, Matillion, AWS S3, EC2. Preferred skills: SSRS, SSIS, Informatica, shell scripting.

Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 1 month ago

Apply

3 - 8 years

10 - 20 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the following details ASAP: C.CTC, E.CTC, notice period, current location, whether you are serving notice/immediately available, experience in Snowflake, and experience in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply.

Interview process: 2 rounds (virtual) + final round (F2F). Please note: WFO, work from office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehousing, Python, Matillion, AWS S3, EC2. Preferred skills: SSRS, SSIS, Informatica, shell scripting.

Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 1 month ago

Apply

5 - 10 years

10 - 20 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the following details ASAP: C.CTC, E.CTC, notice period, current location, whether you are serving notice/immediately available, experience in Snowflake, and experience in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply.

Interview process: 2 rounds (virtual) + final round (F2F). Please note: WFO, work from office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehousing, Python, Matillion, AWS S3, EC2. Preferred skills: SSRS, SSIS, Informatica, shell scripting.

Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 2 months ago

Apply

7 - 9 years

15 - 20 Lacs

Hyderabad

Work from Office

Source: Naukri

1. Data Solution Design and Development:
- Architect, design, and implement end-to-end data pipelines and systems that handle large-scale, complex datasets.
- Ensure optimal system architecture for performance, scalability, and reliability.
- Evaluate and integrate new technologies to enhance existing solutions.
- Implement best practices in ETL/ELT processes, data integration, and data warehousing.
2. Project Leadership and Delivery:
- Lead technical project execution, ensuring timelines and deliverables are met with high quality.
- Collaborate with cross-functional teams to align business goals with technical solutions.
- Act as the primary point of contact for clients, translating business requirements into actionable technical strategies.
3. Team Leadership and Development:
- Manage, mentor, and grow a team of 5 to 7 data engineers.
- Conduct code reviews and validations, and provide feedback to ensure adherence to technical standards.
- Provide technical guidance and foster an environment of continuous learning, innovation, and collaboration.
4. Optimization and Performance Tuning:
- Analyze and optimize existing data workflows for performance and cost-efficiency.
- Troubleshoot and resolve complex technical issues within data systems.
5. Adaptability and Innovation:
- Embrace a consulting mindset with the ability to quickly learn and adopt new tools, technologies, and frameworks.
- Identify opportunities for innovation and implement cutting-edge technologies in data engineering.
- Exhibit a "figure it out" attitude, taking ownership and accountability for challenges and solutions.
6. Client Collaboration:
- Engage with stakeholders to understand requirements and ensure alignment throughout the project lifecycle.
- Present technical concepts and designs to both technical and non-technical audiences.
- Communicate effectively with stakeholders to ensure alignment on project goals, timelines, and deliverables.
7. Learning and Adaptability:
- Stay updated on emerging data technologies, frameworks, and tools.
- Actively explore and integrate new technologies to improve existing workflows and solutions.
8. Internal Initiatives and Eminence Building:
- Drive internal initiatives to improve processes, frameworks, and methodologies.
- Contribute to the organization's eminence by developing thought leadership, sharing best practices, and participating in knowledge-sharing activities.

Qualifications

Education:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Certifications in cloud platforms, such as Snowflake SnowPro or Azure Data Engineer, are a plus.
Experience:
- 7 to 10 years of experience in data engineering, with hands-on expertise in data pipeline development, architecture, and system optimization.
- Proven track record of leading data engineering teams and managing end-to-end project delivery.
Technical skills:
- Expertise in programming languages such as Python, Scala, or Java.
- Proficiency in designing and delivering data pipelines in cloud data warehouses (e.g., Snowflake, Redshift) using ETL/ELT tools such as Matillion, dbt, Striim, etc.
- Solid understanding of database systems (relational and NoSQL) and data modeling techniques.
- 2+ years of hands-on experience designing and developing data integration solutions using Matillion and/or dbt.
- Strong knowledge of data engineering and integration frameworks.
- Expertise in architecting data solutions.
- Successfully implemented at least two end-to-end projects with multiple transformation layers.
- Good grasp of coding standards, with the ability to define standards and testing strategies for projects.
- Proficiency in working with cloud platforms (AWS, Azure, GCP) and associated data services.
- Enthusiastic about working in Agile methodology.
- Comprehensive understanding of the DevOps process, including GitHub integration and CI/CD pipelines.
- Experience with containerization (Docker) and orchestration tools (such as Airflow and Control-M).
Soft skills:
- Exceptional problem-solving and analytical skills.
- Strong communication and interpersonal skills to manage client relationships and team dynamics.
- Ability to thrive in a consulting environment, quickly adapting to new challenges and domains.
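The orchestration-tool experience this role lists (Airflow, Control-M) is, at its core, about running pipeline tasks in dependency order. A tiny self-contained sketch of that scheduling idea, using the standard library's topological sorter (task names are illustrative, not from any real DAG):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Orchestration sketch: declare task dependencies the way an Airflow DAG
# does (task -> set of upstream tasks), then compute a valid run order.

deps = {
    "extract_orders": set(),
    "extract_customers": set(),
    "stage_load": {"extract_orders", "extract_customers"},
    "transform": {"stage_load"},
    "publish_marts": {"transform"},
}

order = list(TopologicalSorter(deps).static_order())
# Every task appears only after all of its upstream dependencies.
```

Real schedulers add retries, parallelism, and calendars on top, but the dependency-ordering core is exactly this.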

Posted 2 months ago

Apply

10 - 16 years

20 - 35 Lacs

Pune, Bengaluru, Mohali

Hybrid

Source: Naukri

Designation: Lead Data Engineer. Experience: 10 to 16 years. Location: Bengaluru/Mohali/Gurugram/Pune (hybrid model). Notice period: up to 60 days. Mandatory skills: Snowflake (3+ years), Python (5+ years), Matillion (2+ years), SQL (5+ years), Tableau (3+ years).

Posted 2 months ago

Apply

5 - 10 years

13 - 23 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the following details ASAP: C.CTC, E.CTC, notice period, current location, whether you are serving notice/immediately available, experience in Snowflake, and experience in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply.

Interview process: 2 rounds (virtual) + final round (F2F). Please note: WFO, work from office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehousing, Python, Matillion, AWS S3, EC2. Preferred skills: SSRS, SSIS, Informatica, shell scripting.

Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 2 months ago

Apply

5 - 10 years

20 - 25 Lacs

Hyderabad

Work from Office

Source: Naukri

Snowflake Matillion Developer, reputed US-based IT MNC. If you are a Snowflake Matillion Developer, email your CV to jagannaath@kamms.net or WhatsApp your CV to 70926 89999.

Experience: 5+ years (only candidates with 100% real-time experience may apply). Role: Snowflake Matillion Developer. Preferred: Snowflake certifications (SnowPro Core/Advanced). Position type: full time/permanent. Location: Hyderabad (work from office). Notice period: immediate to 15 days. Working hours: 4:30 PM to 12:30 AM. Salary: as per your experience. Interviews: 1. technical, online; 2. final, face to face.

Responsibilities: 5+ years of experience in data engineering, ETL, and Snowflake development. Strong expertise in Snowflake SQL scripting, performance tuning, and data warehousing concepts. Hands-on experience with Matillion ETL, building and maintaining Matillion jobs. Strong knowledge of cloud platforms (AWS/Azure/GCP) and cloud-based data architecture. Proficiency in SQL, Python, or scripting languages for automation and transformation. Experience with API integrations and data ingestion frameworks. Understanding of data governance, security policies, and access control in Snowflake. Excellent communication skills, with the ability to interact with business and technical stakeholders. A self-starter who can work independently and drive projects to completion.
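As a flavor of the API-integration and data-ingestion work this role mentions, a minimal paginated-pull loop with bounded retries might look like the following. This is a sketch: `fake_fetch` is a stand-in for a real HTTP client, and all names are hypothetical.

```python
import time

# Paginated ingestion sketch: pull pages until the source is exhausted,
# retrying transient failures a bounded number of times per page.

def ingest_all(fetch_page, max_retries=3):
    rows, page = [], 0
    while True:
        for _ in range(max_retries):
            try:
                batch = fetch_page(page)
                break
            except IOError:
                time.sleep(0)  # placeholder; real code would back off longer
        else:
            raise RuntimeError(f"page {page} failed after {max_retries} tries")
        if not batch:
            return rows  # an empty page signals end of data
        rows.extend(batch)
        page += 1

# Fake source: two pages of data plus one transient failure on page 1.
pages = {0: [{"id": 1}, {"id": 2}], 1: [{"id": 3}], 2: []}
failures = {"left": 1}

def fake_fetch(page):
    if page == 1 and failures["left"] > 0:
        failures["left"] -= 1
        raise IOError("transient")
    return pages[page]

data = ingest_all(fake_fetch)
# All three rows arrive despite the simulated transient failure.
```

A production version would add exponential backoff, checkpointing of the last successful page, and idempotent loads on the warehouse side.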

Posted 2 months ago

Apply

0 - 5 years

10 - 20 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the following details ASAP: C.CTC, E.CTC, notice period, current location, whether you are serving notice/immediately available, experience in Snowflake, and experience in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply.

Interview process: 2 rounds (virtual) + final round (F2F). Please note: WFO, work from office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehousing, Python, Matillion, AWS S3, EC2. Preferred skills: SSRS, SSIS, Informatica, shell scripting.

Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 3 months ago

Apply

5 - 10 years

20 - 25 Lacs

Hyderabad

Work from Office

Source: Naukri

Tableau Developer for reputed US IT MNC, Hyderabad. If you are highly experienced, with 6 years of hands-on experience in Tableau, please share your CV to jagannaath@kamms.net or WhatsApp to 7092689999.

Position title: Tableau Developer. Position type: permanent. Job location: Madhapur, Hyderabad. Mode: office (not negotiable). Timings: 4:30 PM to 12:30 AM (US shift; again, not negotiable).

Job brief: As a Tableau Developer, your role will be to design, build, and enhance the organization's data visualization tools using Tableau software. You'll collaborate closely with business analysts, data engineers, and data scientists to understand requirements and translate them into visually pleasing dashboards and reports.

Roles and responsibilities: Create interactive dashboards and reports in Tableau Desktop and Tableau Server. Verify data accuracy and consistency by working closely with data engineers and analysts. Collaborate with business stakeholders to understand their data requirements and offer solutions that meet them. Diagnose and resolve any problems related to data, dashboards, or reports. Maintain and enhance existing Tableau dashboards and reports. Create technical designs, test cases, and user guides. On Tableau Desktop, create workbooks and hyper extracts. Attach workbooks and data sources to the Tableau development and quality-assurance sites. Update extracts on the QA and development sites. Keep track of the extract schedules on the QA site. Run unit tests on the QA and development sites. Address any performance or data issues with workbooks and data sources. Use Collibra to catalogue measures and dimensions.

Skills and requirements: Advanced proficiency with Tableau Desktop and Server. Excellent knowledge of database concepts and extensive hands-on experience with SQL. Comprehensive understanding of data visualization best practices. Experience with data analysis, modeling, and ETL processes is advantageous. Familiarity with data warehouse and integration tools. Graduate/post-graduate degree in Computer Science from a reputed institute.

Posted 3 months ago

Apply

4 - 7 years

7 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

Strong knowledge of SQL. Strong programming skills in Python. Strong problem-solving skills. Strong communication skills. Ability to learn new technical skills. Primary skills: SQL, AWS, Python. Secondary skills: Agile, Matillion, Redshift. Required candidate profile: 4-7 years of overall experience, with 4+ years of relevant experience in SQL, AWS, and Python. Working experience in Agile methodologies. Strong knowledge of and experience with AWS technologies. Experience with Matillion and Redshift.

Posted 3 months ago

Apply

5 - 7 years

8 - 10 Lacs

Bengaluru

Hybrid

Source: Naukri

Contract duration: 12 months. Experience level: 5+ years of experience designing, developing, documenting, testing, and debugging new and existing software systems. Snowflake SQL: writing SQL queries against Snowflake; developing scripts in Unix and Python. Required candidate profile: Snowflake features such as Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. In-depth understanding of Data Warehouse/ODS, ETL concepts, and modeling structure principles.

Posted 3 months ago

Apply

4 - 7 years

6 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

4-7 years of overall experience, with 4+ years of relevant experience in SQL, AWS, and Python. Working experience in Agile methodologies. Strong knowledge of and experience with AWS technologies. Required candidate profile: experience with Matillion and Redshift. Strong knowledge of SQL. Strong programming skills in Python. Strong problem-solving skills. Strong communication skills.

Posted 3 months ago

Apply

14 - 18 years

16 - 25 Lacs

Chennai, Bengaluru, Noida

Hybrid

Source: Naukri

Job Description: 12+ years of experience managing the delivery of Data Warehouse projects (development and modernization/migration). Strong delivery background with experience managing large, complex Data Warehouse engagements. Experience with Snowflake, Matillion, DBT, Netezza/DataStage, and Oracle is good to have, as is Healthcare Payer industry experience. Extensive experience in program/project management and in Iterative, Waterfall, and Agile methodologies. Ability to track and manage complex program budgets. Experience managing the delivery of complex programs to meet the needs and timelines set for those programs, and communicating program review results to various stakeholders. Experience building teams and providing guidance and education as needed to ensure the success of priority programs and promote cross-training within the department. Experience developing and managing integrated program plans that incorporate both technical and business deliverables. Verify that critical decision gates are well defined, communicated, and monitored for executive approval throughout the program, and that work supports the corporate strategic direction. Review vendor proposals and estimates to ensure they satisfy both functional requirements and technology strategies. Knowledge of project management methodologies, processes, and tools, and of the project development life cycle.

Additional skills: Ability to establish and maintain strong working relationships with stakeholders, including team members, IT resources, other areas of the business, and upper management. Strong business acumen and political savvy. Ability to collaborate in complex situations, think creatively, and drive innovation. Ability to motivate, lead, and inspire a diverse group toward a common goal with multiple stakeholders, and to convert business strategy into action-oriented objectives and measurable results. Strong negotiating, influencing, and consensus-building skills. Ability to mentor, coach, and provide guidance to others.

Responsibilities: Responsible for the end-to-end delivery of Application Development and Support services for the client. Coordinate with the Enterprise Program Management Office (EPMO) to execute programs following defined standards and governance structures, ensuring alignment with the approved project development life cycle (PDLC). Interface regularly with key senior business leaders to enable a smooth transition from strategy development to program identification and execution. Facilitate meetings with task groups or functional areas as required for EPMO-supported initiatives and to resolve issues, proactively engaging other members of the organization with specific subject knowledge. Lead post-implementation reviews of major initiatives to capture lessons learned and drive continuous improvement. Develop accurate and timely summary reports for executive management that provide consolidated, clear, and concise assessments of strategic-initiative implementation status. Collaborate with business owners to develop divisional business plans that support the overall strategic direction, and support the budget allocation process through ongoing financial tracking reports.

Develop and maintain service plans that reflect customer requirements; track and monitor adherence to SLAs/KPIs. Identify opportunities to improve the service delivery process and address service delivery issues, escalations, and complaints, acting as the first point of escalation for customers. Oversee shift management across tracks and publish production support reports and metrics.

Regards, Sanjay Kumar

Posted 3 months ago

Apply

7 - 9 years

12 - 17 Lacs

Bengaluru

Hybrid


Job Summary: The Developer role is responsible for developing and supporting integration and automation solutions using Informatica, Unix scripting, PowerCenter mappings and workflows, IDQ mappings and mapplets, UDO transformations, and the creation of complex XSDs, and for serving as the Informatica subject-matter expert for team members and clients.

Responsibilities: Establish, develop, and update internal enterprise software development standards and best practices for ETL development using Informatica and its extensions (including, but not limited to, PowerCenter mappings and workflows, IDQ mappings and mapplets, UDO transformations, and creation of complex XSDs), Matillion, Snowflake, UNIX, Java, shell scripting, Salesforce, Azure, and Tableau. Analyze, plan, design, develop, test, and implement innovative methods for advancing ETL processes. Partner with the Technical Manager to follow established Enterprise Software Development Life Cycle (SDLC) standards and maintain adherence to these standards across the entire Technology Solutions team. Monitor technology advancements affecting Integration Center tools (Informatica, Tableau, Azure, etc.) and make recommendations to drive efficiencies and ROI.

Use communication, presentation, time-management, organization, and planning skills to achieve team goals and objectives. Keep the Integration Center documentation and best-practices library up to date as reference and training material.

Qualifications, Skills, and Experience: Minimum of a Bachelor's degree in Computer Science or Information Systems. 7+ years of software development experience; 7+ years of experience with relational databases; 7+ years of ETL experience (analyze, plan, design, develop, test, implement, and change). Expert-level experience with Oracle SQL, PL/SQL, the Informatica stack (including, but not limited to, PowerCenter mappings and workflows, IDQ mappings and mapplets, UDO transformations, and creation of complex XSDs), UNIX, Java, shell scripting, Salesforce, and Tableau. 7+ years of experience in data analysis and troubleshooting. Hands-on experience with Snowflake databases and creating ELT jobs in Matillion. Ability to manage a team with a high-volume workload, providing varying levels of support when required. Expert knowledge of MS Office tools, including MS Excel and MS Access. Experience with data warehouses and a sound understanding of data warehousing principles; VBA development experience in Excel is a plus. Must have strong technical, organizational, and communication skills (both written and verbal), robust problem-solving capabilities, strong analytical skills, flexibility, and the ability to handle multiple tasks concurrently. Aptitude for learning new technologies on the fly.
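The data-quality rules built in tools like IDQ (completeness, format, and range checks) can be illustrated with a simple rule-based validator; the rules and field names below are hypothetical:

```python
import re

# Toy analogue of IDQ-style data-quality checks: each field gets a rule,
# and a record's failing fields are reported. Field names and rules are
# hypothetical examples, not from any specific mapplet.

RULES = {
    "email": lambda v: bool(re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
    "age":   lambda v: isinstance(v, int) and 0 <= v <= 120,
    "name":  lambda v: bool(v and v.strip()),
}

def validate(record: dict) -> list:
    """Return the names of fields that fail their quality rule."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field))]

good = {"email": "a@b.com", "age": 34, "name": "Ann"}
bad  = {"email": "not-an-email", "age": 200, "name": ""}
print(validate(good))  # expect no failures
print(validate(bad))
```

In a production pipeline the failing records would typically be routed to a rejects table for remediation rather than silently dropped.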

Posted 3 months ago

Apply

7 - 12 years

18 - 30 Lacs

Pune, Delhi NCR, Bengaluru

Hybrid


Role & responsibilities Design, develop, and maintain robust and scalable data pipelines using modern ETL/ELT tools and techniques. Implement and manage data orchestration tools such as DBT, Fivetran, Stitch, or Matillion. Build and optimize data models for various analytical and reporting needs. Ensure data quality and integrity through rigorous testing and validation. Monitor and troubleshoot data pipelines and infrastructure, proactively identifying and resolving issues. Collaborate with data scientists and analysts to understand their data requirements and provide support. Stay up-to-date with the latest data engineering trends and technologies. Contribute to the development and improvement of our data engineering best practices. Mentor junior data engineers and provide technical guidance. Participate in code reviews and contribute to a collaborative development environment. Document data pipelines and infrastructure for maintainability and knowledge sharing. Contribute to the architecture and design of our overall data platform. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 7+ years of proven experience as a Data Engineer, preferably in a fast-paced environment. Deep understanding of data warehousing concepts and best practices. Hands-on experience with at least one data orchestration tool (DBT, Fivetran, Stitch, Matillion). Proficiency in SQL and extensive experience with data modeling. Experience with cloud-based data warehousing solutions (e.g., Snowflake, BigQuery, Redshift). Experience with programming languages like Python or Scala is highly preferred.
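The incremental-loading pattern that orchestration tools like DBT, Fivetran, or Matillion implement can be sketched as a high-watermark merge; the table shape and column names here are hypothetical, with in-memory dicts standing in for warehouse tables:

```python
# High-watermark incremental load sketch: only rows newer than the last
# loaded timestamp are upserted into the target, keyed by id. This mirrors
# what an incremental DBT model or Matillion job does on each run.

def incremental_merge(target: dict, source_rows: list, watermark: int) -> int:
    """Upsert rows with updated_at > watermark; return the new watermark."""
    new_mark = watermark
    for row in source_rows:
        if row["updated_at"] > watermark:
            target[row["id"]] = row          # insert or overwrite by key
            new_mark = max(new_mark, row["updated_at"])
    return new_mark

target = {}
batch1 = [{"id": 1, "updated_at": 100, "val": "a"},
          {"id": 2, "updated_at": 150, "val": "b"}]
mark = incremental_merge(target, batch1, watermark=0)   # loads both rows
batch2 = [{"id": 2, "updated_at": 200, "val": "b2"},    # updated row
          {"id": 1, "updated_at": 100, "val": "a"}]     # unchanged, skipped
mark = incremental_merge(target, batch2, mark)
print(mark, target[2]["val"])
```

Persisting the watermark between runs is what keeps reloads cheap: each run scans only rows changed since the previous run instead of the full source table.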

Posted 3 months ago

Apply

5 - 10 years

16 - 31 Lacs

Chennai, Pune, Bengaluru

Work from Office


Hi Connections, I am looking for a Matillion Lead for one of our MNC clients. Experience: 5+ years. Please email your resume to parul@mounttalent.com. Skills Required: Matillion, Python, SQL. Location: Pune, Mumbai, Noida, Chennai, Bangalore, Hyderabad.

Posted 3 months ago

Apply