
836 Talend Jobs - Page 30

JobPe aggregates listings for easy access; applications are made directly on the original job portal.

2 - 7 years

4 - 9 Lacs

Chennai

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum experience: 2 years
Educational qualification: Must be a graduate

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems using Google BigQuery.

Roles & Responsibilities:
• Design, develop, and maintain data solutions for data generation, collection, and processing using Google BigQuery.
• Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
• Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
• Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
• Implement data security and privacy measures to protect sensitive data.

Professional & Technical Skills:
• Must have: proficiency in Google BigQuery.
• Good to have: experience with ETL tools such as Apache NiFi or Talend.
• Strong understanding of data modeling, data warehousing, and data integration concepts.
• Experience with SQL and NoSQL databases.
• Familiarity with cloud computing platforms such as Google Cloud Platform or AWS.
• Experience with data security and privacy measures.

Additional Information: The candidate should have a minimum of 2 years of experience in Google BigQuery. The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office. Qualifications: must be a graduate.
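As a quick illustration of the extract-transform-load pattern this role centres on, here is a hedged sketch: all names are illustrative, and an in-memory sqlite3 database stands in for BigQuery, which a real pipeline would reach through the `google-cloud-bigquery` client.

```python
import sqlite3

def run_etl(raw_rows):
    """Tiny ETL sketch: clean raw records, then load them into a
    warehouse table (sqlite stands in for BigQuery here)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
    # Transform: drop malformed rows (a basic data-quality check),
    # and normalise amounts to floats.
    clean = [(int(r["user_id"]), float(r["amount"]))
             for r in raw_rows
             if r.get("user_id") is not None and r.get("amount") is not None]
    # Load the cleaned rows.
    conn.executemany("INSERT INTO events VALUES (?, ?)", clean)
    conn.commit()
    total, = conn.execute("SELECT SUM(amount) FROM events").fetchone()
    return len(clean), total

loaded, total = run_etl([
    {"user_id": 1, "amount": "10.5"},
    {"user_id": 2, "amount": "4.5"},
    {"user_id": None, "amount": "99"},  # rejected by the quality check
])
print(loaded, total)  # 2 15.0
```

The same shape (validate, transform, bulk-insert, verify) carries over when the sink is a BigQuery table instead of sqlite.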

Posted 3 months ago


6 - 8 years

8 - 12 Lacs

Chennai, Hyderabad

Work from Office


What you'll be doing...
In this role as a Senior Engineer on the Tech Strategy & Planning team (TSGI - Cloud and Enterprise Architecture), you'll manage multiple programs designed to help GTS move towards its strategic objectives. You will be a key contributor in planning and building the Data, Platforms and Emerging Technology north star, and in ensuring the smooth and successful execution of initiatives by coordinating with cross-functional teams. Your expertise will help us solve complex problems and find unique solutions to optimize the technology landscape of our organization.

Your responsibilities include, but are not limited to:
• Technology road-mapping: develop and maintain a technology roadmap that aligns with the organization's strategic goals and evolving industry trends.
• Strategic planning: collaborate with leadership to identify opportunities where technology can be a competitive advantage, and contribute to the development of the company's overall strategic plan.
• Innovation and research: stay updated on emerging technologies, assess their relevance to the business, and propose innovative solutions.
• Vendor and partner management: evaluate and manage relationships with technology vendors and partners to ensure they support the organization's strategic objectives. Evaluate Gen AI tools in partnership with the AI&D team.
• Develop scalable prototypes of LLM and NLP modules and systems that are critical to the company's product lines.
• Apply state-of-the-art LLM techniques to understand large amounts of unstructured data and translate them into meaningful, structured data.
• Design, develop, and evaluate predictive LLM models that are on par with industry standards, and define metrics that measure success and customer value delivery.
• Work closely with process experts to analyze and design solutions based on business requirements.

What we're looking for...
You're analytical and great at quickly grasping challenging concepts. As a strong written and verbal communicator, you deliver complex messages vividly to technical and business audiences alike. You're no stranger to a fast-paced environment and tight deadlines, and you adapt to changing priorities and balance multiple projects with ease. You take pride in your work and get a lot of satisfaction from meeting and exceeding the expectations of your customers.

You'll need to have:
• Bachelor's degree or four or more years of experience.
• Minimum 6 years of experience in one or more of Data Science, LLMs, Gen AI.
• Established experience delivering information management solutions to large numbers of end users.
• Experience building LLM solutions to business problems across support, sales, digital, chat, voice, etc.
• Familiarity with Gen AI models (OpenAI, Gemini, Vertex AI, Mistral, LLaMA, etc.) and fine-tuning them for domain-specific needs.
• Experience in text processing, vector databases, and embedding models.
• Experience in NLP, transformers, and neural networks.
• Hands-on experience using LangChain and good exposure to LLMOps.
• Strong independent and creative research skills necessary to keep up with the latest trends in advanced analytics and ML.
• Ability to research, recommend, and implement best practices for LLM and NLP systems that can scale.
• Ability to identify new process opportunities while quickly assessing feasibility.
• Excellent written and verbal communication skills: able to effectively communicate technical details to a non-technical audience, and to produce clear, concise written documentation.

Even better if you have one or more of the following:
• Familiarity with graph database concepts.
• Familiarity with one or more data platforms: Cloudera, Snowflake, Databricks, etc.
• Experience in one or more big data and ETL technologies: Informatica, Talend, Teradata, Hadoop, etc.
• Experience in one or more BI platforms: Tableau, Looker, ThoughtSpot, etc.
• Master's degree from an accredited college or university preferred.
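The vector-database and embedding-model work this listing mentions boils down to nearest-neighbour search over embeddings. A hedged, dependency-free sketch (vectors and document names are made up; production systems use a real embedding model and a vector store):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_match(query_vec, docs):
    """docs: list of (doc_id, embedding); return the id of the
    document whose embedding is closest to the query."""
    return max(docs, key=lambda d: cosine(query_vec, d[1]))[0]

# Toy 3-dimensional "embeddings" for two indexed documents.
docs = [("faq", [0.9, 0.1, 0.0]), ("billing", [0.0, 0.8, 0.6])]
print(top_match([1.0, 0.0, 0.0], docs))  # faq
```

Retrieval-augmented LLM pipelines (e.g. with LangChain) wrap exactly this step: embed the query, find the nearest stored chunks, and feed them to the model as context.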

Posted 3 months ago


2 - 5 years

5 - 10 Lacs

Noida

Work from Office


Preferred candidate profile:
We are seeking an experienced Salesforce Developer with a strong background in supporting nonprofits and/or Community Development Financial Institutions (CDFIs). The ideal candidate will be responsible for designing, developing, and implementing customized Salesforce solutions that drive operational efficiency and support mission-driven initiatives. This role requires expertise in Salesforce Nonprofit Success Pack (NPSP) or Financial Services Cloud (FSC), along with strong integration and automation skills.

Role & responsibilities:
• Develop, customize, and maintain Salesforce applications to support nonprofit or CDFI operations, including donor management, grant tracking, loan servicing, and client engagement.
• Design and implement Apex triggers, classes, Visualforce pages, and Lightning Web Components (LWC).
• Configure and enhance Salesforce Nonprofit Success Pack (NPSP) or Financial Services Cloud (FSC) to optimize workflows.
• Integrate Salesforce with external systems such as financial platforms, donor databases, loan management systems, and payment processors.
• Automate processes using Flows, Process Builder, and declarative tools to reduce manual tasks.
• Ensure data integrity, security, and compliance with industry standards and best practices.
• Work closely with fundraising, finance, and community development teams to align Salesforce solutions with organizational needs.
• Provide technical support, troubleshoot issues, and perform system maintenance and enhancements.
• Stay updated on Salesforce releases, nonprofit solutions, and CDFI regulations to recommend improvements.

Required qualifications & experience:
• Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
• 2+ years of Salesforce development experience with a focus on nonprofit or financial services solutions.
• Strong proficiency in Apex, Visualforce, SOQL, SOSL, and Lightning Web Components (LWC).
• Experience with Salesforce Nonprofit Success Pack (NPSP) or Financial Services Cloud (FSC).
• Knowledge of Salesforce APIs (REST/SOAP) and integration with third-party applications.
• Understanding of fundraising, grant management, loan processing, and impact tracking within nonprofit or CDFI settings.
• Experience with Salesforce declarative tools (Flows, Process Builder, Reports, and Dashboards).
• Familiarity with Salesforce security models, role hierarchies, and compliance requirements.
• Salesforce Platform Developer I certification (Platform Developer II and Nonprofit Cloud Consultant certifications are a plus).
• Strong analytical and problem-solving skills with the ability to translate business needs into technical solutions.
• Excellent communication and collaboration skills, particularly with non-technical stakeholders.

Preferred skills:
• Experience with Salesforce.org solutions for nonprofits, including grant and impact management tools.
• Knowledge of CDFI operations, financial services regulations, and compliance reporting.
• Familiarity with third-party apps like Classy, Give Lively, iATS, or other nonprofit fundraising platforms.
• Experience with data migration and ETL tools like MuleSoft, Talend, or Data Loader.
• Knowledge of JavaScript, HTML, CSS, and front-end frameworks.
• Experience with Agile methodologies and project management tools (e.g., Jira, Asana, or Trello).

Compensation: The offer shall be commensurate with qualifications and experience, and comparable with industry norms.

Posted 3 months ago


10 - 15 years

35 - 40 Lacs

Hyderabad

Work from Office


Responsibilities:

1. Integration Strategy & Architecture
• Define the enterprise integration strategy, aligning with business goals and IT roadmaps.
• Design scalable, resilient, and secure integration architectures using industry best practices.
• Develop API-first and event-driven integration strategies.
• Establish governance frameworks, integration patterns, and best practices.

2. Technology Selection & Implementation
• Evaluate and recommend the right integration technologies, such as:
  - Middleware & ESB: TIBCO, MuleSoft, WSO2, IBM Integration Bus
  - Event streaming & messaging: Apache Kafka, RabbitMQ, IBM MQ
  - API management: Apigee, Kong, AWS API Gateway, MuleSoft
  - ETL & data integration: Informatica, Talend, Apache NiFi
  - iPaaS (cloud integration): Dell Boomi, Azure Logic Apps, Workato
• Lead the implementation and configuration of these platforms.

3. API & Microservices Architecture
• Design and oversee API-led integration strategies.
• Implement RESTful APIs, GraphQL, and gRPC for real-time and batch integrations.
• Define API security standards (OAuth, JWT, OpenID Connect, API Gateway).
• Establish API versioning, governance, and lifecycle management.

4. Enterprise Messaging & Event-Driven Architecture (EDA)
• Design real-time, event-driven architectures using:
  - Apache Kafka for streaming and pub/sub messaging
  - RabbitMQ, IBM MQ, TIBCO EMS for message queuing
  - Event-driven microservices using Kafka Streams, Flink, or Spark Streaming
• Ensure event sourcing, CQRS, and eventual consistency in distributed systems.

5. Cloud & Hybrid Integration
• Develop hybrid integration strategies across on-premises, cloud, and SaaS applications.
• Utilize cloud-native integration tools like AWS Step Functions, Azure Event Grid, Google Cloud Pub/Sub.
• Integrate enterprise applications (ERP, CRM, HRMS) across SAP, Oracle, Salesforce, Workday.

6. Security & Compliance
• Ensure secure integration practices, including encryption, authentication, and authorization.
• Implement zero-trust security models for APIs and data flows.
• Maintain compliance with industry regulations (GDPR, HIPAA, SOC 2).

7. Governance, Monitoring & Optimization
• Establish enterprise integration governance frameworks.
• Use observability tools for real-time monitoring (Datadog, Splunk, New Relic).
• Optimize integration performance and troubleshoot bottlenecks.

8. Leadership & Collaboration
• Collaborate with business and IT stakeholders to understand integration requirements.
• Work with DevOps and cloud teams to ensure CI/CD pipelines for integration.
• Provide technical guidance to developers, architects, and integration engineers.

Qualifications

Technical skills:
• 10+ years of experience.
• Expertise in integration platforms: Informatica, TIBCO, MuleSoft, WSO2, Dell Boomi.
• Strong understanding of API management and microservices.
• Experience with enterprise messaging and streaming (Kafka, RabbitMQ, IBM MQ, Azure Event Hub).
• Knowledge of ETL and data pipelines (Informatica, Talend, Apache NiFi, AWS Glue).
• Experience in cloud and hybrid integration (AWS, Azure, GCP, OCI).
• Hands-on with security and compliance (OAuth2, JWT, SAML, API security, zero trust).

Soft skills:
• Strategic thinking & architecture design
• Problem-solving & troubleshooting
• Collaboration & stakeholder management
• Agility in digital transformation & cloud migration
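The event sourcing mentioned under the EDA responsibilities can be shown in a few lines: state is never stored directly, only an append-only log of events, and the current state is rebuilt by replaying that log. A hedged sketch with invented event names and a bank-balance example:

```python
def apply_event(state, event):
    """Fold one event into the current account balance."""
    kind, amount = event
    if kind == "deposit":
        return state + amount
    if kind == "withdraw":
        return state - amount
    raise ValueError(f"unknown event kind: {kind}")

def replay(events, initial=0):
    """Rebuild state by replaying the append-only event log,
    the core move of an event-sourced system."""
    state = initial
    for e in events:
        state = apply_event(state, e)
    return state

log = [("deposit", 100), ("withdraw", 30), ("deposit", 5)]
print(replay(log))  # 75
```

In a production architecture the log would live in a durable stream (e.g. a Kafka topic) and CQRS read models would be separate projections built from the same replay.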

Posted 3 months ago


3 - 6 years

5 - 8 Lacs

Bengaluru

Work from Office


• Minimum of 3 years of hands-on experience with SQL, specifically MySQL.
• Proficiency in at least one ETL tool (Talend, Informatica, Azure Data Factory, Databricks) is required.
• Solid understanding of Python programming, with a minimum of 1 year of experience.
• Familiarity with AWS or Azure is a strong advantage.
• Knowledge of Git for version control is preferred.
• Experience working with Tableau for data visualization is desirable.
• Familiarity with NoSQL databases is a plus.
• Minimum of 1 year of hands-on experience with Apache Spark.
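A staple of the SQL proficiency asked for here is the "latest row per key" pattern via a window function. A hedged sketch (table and column names invented; sqlite3 stands in for MySQL, whose 8.0+ releases support the same `ROW_NUMBER()` syntax):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, ts INTEGER, total REAL);
INSERT INTO orders VALUES
  ('alice', 1, 10.0), ('alice', 3, 25.0),
  ('bob',   2, 7.5);
""")
# Latest order per customer: number rows within each customer
# partition by descending timestamp, then keep rank 1.
rows = conn.execute("""
SELECT customer, total FROM (
  SELECT customer, total,
         ROW_NUMBER() OVER (PARTITION BY customer ORDER BY ts DESC) AS rn
  FROM orders
) WHERE rn = 1
ORDER BY customer
""").fetchall()
print(rows)  # [('alice', 25.0), ('bob', 7.5)]
```

The same query shape translates directly to Spark SQL, which this listing also asks about.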

Posted 3 months ago


3 - 5 years

10 - 20 Lacs

Pune, Hyderabad

Work from Office


PharmaACE is a growing global healthcare consulting firm, headquartered in Princeton, New Jersey. Our expert teams of business analysts, based across the US, Canada, Europe, and India, provide analytics and business solutions using our worldwide delivery models for a wide range of clients. Our clients include established, multinational BioPharma leaders and innovators, as well as entrepreneurial firms on the cutting edge of science. We have deep expertise in Forecasting, Business Analytics, Competitive Intelligence, Sales Analytics, and the Analytics Centre of Excellence Model. Our wealth of therapeutic area experience cuts across Oncology, Immuno-science, CNS, CV-Met, and Rare Diseases. We support our clients' needs in Primary Care, Specialty Care, and Hospital business units, and we have managed portfolios in the Biologics space, Branded Pharmaceuticals, Generics, APIs, Diagnostics, and Packaging & Delivery Systems.

Responsibilities:
• Work closely with business teams/stakeholders across the pharmaceutical value chain and develop reports and dashboards that tell a story.
• Recommend KPIs and help generate custom analyses and insights.
• Propose new visualization ideas for our customers, keeping the audience type in mind.
• Design Tableau dashboards and reports that are self-explanatory.
• Keep the user at the center while designing reports, thereby enhancing the user experience.
• Gather requirements while working closely with our global clients.
• Mentor other developers on the team through Tableau-related technical challenges.
• Propagate Tableau best practices within and across teams.
• Set up reports that can be maintained with ease and are scalable to other use cases.
• Interact with the AI/ML team and incorporate new ideas into the final deliverables for the client.
• Work closely with cross-functional teams such as Advanced Analytics, Competitive Intelligence, and Forecasting.
• Develop and foster client relationships and serve as a point of contact for projects.

Qualifications and areas of expertise:
• Educational qualification: BE/BTech/MTech/MCA from a reputed institute.
• Minimum 3-5 years of experience.
• Proficient with tools including Tableau Desktop, Tableau Server, MySQL, MS Excel, and ETL tools (Alteryx, Tableau Prep, or Talend).
• Knowledge of SQL.
• Experience in advanced LOD (level-of-detail) calculations, custom visualizations, and data cleaning and restructuring.
• Strong analytical and problem-solving skills with the ability to question facts.
• Excellent written and oral communication skills.

Nice to have:
• A valid US business visa.
• Hands-on experience in Tableau, Python, and R.
• Hands-on experience with Qlik Sense and Power BI.
• Experience with pharma/healthcare data.
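The "advanced LOD calcs" this listing asks about are Tableau level-of-detail expressions such as `{FIXED [Region] : AVG([Sales])}`, which attach a group-level aggregate to every row. A hedged, pure-Python emulation of that semantics (field names are invented):

```python
from collections import defaultdict

def fixed_lod_avg(rows, dim, measure):
    """Emulates a Tableau {FIXED [dim] : AVG(measure)} expression:
    every row gains the average of its dimension group."""
    sums, counts = defaultdict(float), defaultdict(int)
    for r in rows:
        sums[r[dim]] += r[measure]
        counts[r[dim]] += 1
    return [{**r, "avg_" + measure: sums[r[dim]] / counts[r[dim]]}
            for r in rows]

rows = [
    {"region": "EU", "sales": 10.0},
    {"region": "EU", "sales": 30.0},
    {"region": "US", "sales": 8.0},
]
out = fixed_lod_avg(rows, "region", "sales")
print(out[0]["avg_sales"], out[2]["avg_sales"])  # 20.0 8.0
```

Unlike an ordinary aggregation, the result keeps row-level detail, which is exactly what makes FIXED LOD expressions useful for ratio-to-group and benchmark visualizations.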

Posted 3 months ago


7 - 12 years

22 - 37 Lacs

Chennai, Bangalore Rural, Hyderabad

Work from Office


• Stibo MDM
• Stibo configuration: business rules, workflow, UI, inbound and outbound processes
• Product MDM experience
• Integration experience using Java and/or Talend

Posted 3 months ago


4 - 7 years

10 - 20 Lacs

Bengaluru

Work from Office


Key Responsibilities:

Data Engineering & Process Automation:
• Develop and maintain ETL/ELT pipelines to process billing and payment data.
• Write efficient SQL scripts and Python processes for data transformation and automation.
• Optimize data workflows to ensure efficient and accurate data processing.
• Ensure data discoverability, integrity, and proper transfer for customer billing communications.

Technical Documentation & Process Management:
• Create and maintain technical documentation covering data processes and the design and delivery of bill-pay communications (e.g., Bill Ready, Disconnect alerts).
• Document data flow diagrams, data lineage, and system dependencies.

Billing & Payment Data Processing:
• Understand and manage complex billing and payment data for both Mobile and Fixed services.
• Prevent errors in customer communication by ensuring data accuracy and proper trigger mechanisms.
• Work closely with business stakeholders to enhance billing and payment workflows.

Cross-Functional Collaboration:
• Represent technical requirements in discussions with Billing, Collections, Retention, and other teams, and with third-party vendors where needed.
• Provide insights on data design, triggers, and automation processes.

Posted 3 months ago


2 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office


Integration Architect - Supply Chain Planning

Find endless opportunities to solve our clients' toughest challenges, as you work with exceptional people, the latest tech, and leading companies across industries.

Practice: Supply Chain and Operations, Industry Consulting, Global Network | Areas of Work: Supply Chain Management - Kinaxis RapidResponse Planning | Level: Consultant | Location: Bengaluru, Gurugram, Mumbai, Chennai, Hyderabad | Years of Exp: 2-6 years

Explore an exciting career at Accenture
Are you an outcome-oriented problem solver? Do you enjoy working on transformation strategies for global clients? Does working in an inclusive and collaborative environment spark your interest? Then Strategy and Consulting Global Network SC&O is the right place for you to explore limitless possibilities.

The Practice - A Brief Sketch
As part of our Supply Chain and Operations practice, you will help organizations reimagine and transform their supply chains for tomorrow, with a positive impact on the business, society, and the planet. Together, let's innovate, build competitive advantage, and improve business and societal outcomes in an ever-changing, ever-challenging world. Help us make supply chains work better, faster, and more resiliently, through the following initiatives:
• Support clients and teams in the design, development, and implementation of new and improved business processes, enabling technology in supply chain projects.
• Take part in supply chain planning process and requirement discussions with the client to configure the data structure or data model accordingly.
• Work with the client on the design, development, and testing of supply chain implementation projects.
• Design apt solutions by considering the inbuilt as well as configurable capabilities of Kinaxis RapidResponse.
• Work with the client team to understand the system landscape.
• Run workshops with single points of contact for each legacy system being integrated with Kinaxis RapidResponse.
• Provide data specification documents based on the Kinaxis RapidResponse configuration.
• Create namespaces or tables based on the client's current data flow.
• Create transformation workbooks, design test scripts for configuration testing, and train the integration team on the client business solution.
• Ensure RapidResponse is integrated with the client's systems.

Bring your best skills forward to excel in the role:
• Excellent authoring skills and the ability to independently build resources
• Ability to solve complex business problems and deliver client delight
• Strong analytical and writing skills to build viewpoints on industry trends
• Excellent communication, interpersonal, and presentation skills
• Cross-cultural competence with an ability to thrive in a dynamic environment

Qualifications - Your experience counts!
• MBA from a Tier-1 B-school
• 5+ years of experience working as an Integration Architect on Kinaxis RapidResponse
• End-to-end implementation experience as an Integration Architect
• Experience with Data Integration server or the Talend tool
• Experience across industries such as Life Sciences, Auto, and Consumer Packaged Goods preferred
• Knowledge of the different scheduled tasks and scripts required to set up a consistent flow of data
• A good understanding of extraction, transformation, and load (ETL) concepts to proactively troubleshoot integration issues
• Experience managing the implementation of Kinaxis RapidResponse Administrator and coordinating key stakeholders within the same project
• Must be a certified RapidResponse Administrator, Level 2

What's in it for you?
• An opportunity to work on transformative projects with key G2000 clients.
• Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
• Ability to embed responsible business into everything, from how you serve your clients to how you operate as a responsible professional.
• Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities.
• Opportunity to thrive in a culture committed to accelerating equality for all, and to engage in boundaryless collaboration across the entire organization.

About Accenture:
Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With XX people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com

About Accenture Strategy & Consulting:
Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers, and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https://www.accenture.com/us-en/Careers/capability-network

Accenture Global Network SC&O | Accenture in One Word
At the heart of every great change is a great human. If you have ideas, ingenuity, and a passion for making a difference, come and be a part of our team.

Posted 3 months ago


4 - 7 years

6 - 10 Lacs

Hyderabad

Work from Office


Responsibilities:

1. Integration Strategy & Architecture
• Define the enterprise integration strategy, aligning with business goals and IT roadmaps.
• Design scalable, resilient, and secure integration architectures using industry best practices.
• Develop API-first and event-driven integration strategies.
• Establish governance frameworks, integration patterns, and best practices.

2. Technology Selection & Implementation
• Evaluate and recommend the right integration technologies, such as:
  - Middleware & ESB: TIBCO, MuleSoft, WSO2, IBM Integration Bus
  - ETL & data integration: Informatica, Talend, Apache NiFi
  - Event streaming & messaging: Apache Kafka, RabbitMQ, IBM MQ
  - API management: Apigee, Kong, AWS API Gateway, MuleSoft
  - iPaaS (cloud integration): Dell Boomi, Azure Logic Apps, Workato
• Lead the implementation and configuration of these platforms.

3. API & Microservices Architecture
• Design and oversee API-led integration strategies.
• Implement RESTful APIs, GraphQL, and gRPC for real-time and batch integrations.
• Define API security standards (OAuth, JWT, OpenID Connect, API Gateway).
• Establish API versioning, governance, and lifecycle management.

4. Enterprise Messaging & Event-Driven Architecture (EDA)
• Design real-time, event-driven architectures using:
  - Apache Kafka for streaming and pub/sub messaging
  - RabbitMQ, IBM MQ, TIBCO EMS for message queuing
  - Event-driven microservices using Kafka Streams, Flink, or Spark Streaming
• Ensure event sourcing, CQRS, and eventual consistency in distributed systems.

5. Cloud & Hybrid Integration
• Develop hybrid integration strategies across on-premises, cloud, and SaaS applications.
• Utilize cloud-native integration tools like AWS Step Functions, Azure Event Grid, Google Cloud Pub/Sub.
• Integrate enterprise applications (ERP, CRM, HRMS) across SAP, Oracle, Salesforce, Workday.

6. Security & Compliance
• Ensure secure integration practices, including encryption, authentication, and authorization.
• Implement zero-trust security models for APIs and data flows.
• Maintain compliance with industry regulations (GDPR, HIPAA, SOC 2).

7. Governance, Monitoring & Optimization
• Establish enterprise integration governance frameworks.
• Use observability tools for real-time monitoring (Datadog, Splunk, New Relic).
• Optimize integration performance and troubleshoot bottlenecks.

8. Leadership & Collaboration
• Collaborate with business and IT stakeholders to understand integration requirements.
• Work with DevOps and cloud teams to ensure CI/CD pipelines for integration.
• Provide technical guidance to developers, architects, and integration engineers.

Qualifications

Required skills & expertise - technical:
• Expertise in integration platforms: Informatica, TIBCO, MuleSoft, WSO2, Dell Boomi.
• Strong understanding of API management and microservices.
• Experience with enterprise messaging and streaming (Kafka, RabbitMQ, IBM MQ, Azure Event Hub).
• Knowledge of ETL and data pipelines (Informatica, Talend, Apache NiFi, AWS Glue).
• Experience in cloud and hybrid integration (AWS, Azure, GCP, OCI).
• Hands-on with security and compliance (OAuth2, JWT, SAML, API security, zero trust).

Soft skills:
• Strategic thinking & architecture design
• Problem-solving & troubleshooting
• Collaboration & stakeholder management
• Agility in digital transformation & cloud migration

Posted 3 months ago


5 - 8 years

0 - 1 Lacs

Bengaluru

Work from Office


• Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field; master's degree preferred.
• Minimum of 6 years of experience in data integration, ETL development, or a related field.
• Proficiency in PL/SQL, with expertise in designing stored procedures, functions, etc.
• Proficiency in Talend Data Integration and Talend Administration Console, with experience designing and implementing ETL processes.
• Strong understanding of ETL concepts, methodologies, and best practices, including data modeling, schema design, and data warehousing principles.
• Experience working with relational databases (e.g., MySQL, PostgreSQL, Oracle) and NoSQL databases (e.g., MongoDB, Cassandra).
• Knowledge of data governance, data security, and regulatory compliance requirements.
• Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and containerization technologies (e.g., Docker, Kubernetes) is a plus.
• Excellent analytical, problem-solving, and communication skills.
• Ability to work effectively in a fast-paced, dynamic environment with tight deadlines.
• Talend certification (e.g., Talend Data Integration Certified Developer) preferred.

Posted 3 months ago


8 - 13 years

12 - 22 Lacs

Bengaluru

Remote


Job Description: Snowflake Architect - Remote Opportunity (8+ Years of Experience)

Position Overview:
We are seeking an experienced and highly skilled Snowflake Architect to join our team in a fully remote capacity. As a Snowflake Architect, you will be responsible for designing, implementing, and optimizing data solutions leveraging the Snowflake Data Cloud. You will collaborate with cross-functional teams to deliver scalable, high-performance data architectures that support analytics, reporting, and data-driven decision-making. This is a strategic role that requires deep expertise in Snowflake, data warehousing, and cloud architecture, along with strong leadership and communication skills.

Roles and Responsibilities:

Solution architecture:
• Design and implement end-to-end data architecture solutions using Snowflake.
• Define best practices for Snowflake architecture, including schema design, security, and performance optimization.
• Develop scalable data pipelines and workflows to process and transform data efficiently.

Implementation and development:
• Lead the migration of legacy data warehouses to Snowflake.
• Implement Snowflake features such as virtual warehouses, Secure Data Sharing, and Snowpipe for real-time data ingestion.
• Create and optimize SQL queries, stored procedures, and user-defined functions for analytics and reporting.

Performance and optimization:
• Monitor and improve Snowflake performance through workload management, clustering, and caching strategies.
• Optimize data models and query execution plans to ensure high performance and scalability.

Integration and automation:
• Integrate Snowflake with ETL/ELT tools such as Informatica, Talend, dbt, or Apache Airflow.
• Enable seamless data exchange with other platforms like AWS, Azure, GCP, and BI tools (e.g., Tableau, Power BI).

Data governance and security:
• Implement robust data governance policies, including access control, data masking, and encryption.
• Ensure compliance with regulatory standards and best practices for data security.

Collaboration and leadership:
• Collaborate with data engineers, analysts, and business stakeholders to define requirements and deliver solutions.
• Provide technical guidance and mentorship to junior team members.

Documentation and support:
• Document data architecture, processes, and workflows to ensure clarity and maintainability.
• Provide ongoing support for Snowflake environments and troubleshoot issues as needed.

Required Technical Skill Set:
• Snowflake expertise: strong experience with the Snowflake Data Cloud (minimum 3-5 years); proficiency in features like SnowSQL, Time Travel, data sharing, and materialized views.
• Data warehousing and modeling: extensive experience with data warehousing concepts, star and snowflake schemas, and dimensional modeling; strong understanding of ETL/ELT processes and tools.
• Programming and scripting: proficiency in SQL and scripting languages like Python or Java; experience with data orchestration frameworks like Apache Airflow or dbt.
• Cloud platforms: hands-on experience with AWS, Azure, or Google Cloud and their integration with Snowflake.
• Performance tuning: expertise in query optimization, clustering strategies, and workload management in Snowflake.
• Data integration: experience integrating Snowflake with BI tools (e.g., Tableau, Power BI, Qlik) and data lakes.

Preferred Skills and Certifications:
• Snowflake SnowPro Core Certification (highly preferred).
• AWS Certified Data Analytics, Google Professional Data Engineer, or Azure Data Engineer certifications.
• Familiarity with machine learning workflows using Snowflake.
• Knowledge of NoSQL databases and big data technologies like Hadoop or Spark.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
• 8+ years of experience in data architecture, with a focus on Snowflake and cloud data platforms.
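One Snowflake behaviour worth internalizing for this role is Snowpipe's load deduplication: a file delivered twice is ingested only once. A hedged, dependency-free sketch of that idempotent-load semantics, keyed here by a content digest (class and method names are invented for illustration):

```python
import hashlib

class IdempotentLoader:
    """Sketch of Snowpipe-style load semantics: each file is ingested
    at most once, tracked by a digest of its content."""
    def __init__(self):
        self.seen = set()
        self.rows = []

    def ingest(self, filename, content):
        """Return the number of rows loaded (0 on duplicate delivery)."""
        digest = hashlib.sha256(content.encode()).hexdigest()
        if digest in self.seen:       # already loaded: skip silently
            return 0
        self.seen.add(digest)
        loaded = [line for line in content.splitlines() if line]
        self.rows.extend(loaded)
        return len(loaded)

loader = IdempotentLoader()
print(loader.ingest("a.csv", "1,x\n2,y\n"))  # 2
print(loader.ingest("a.csv", "1,x\n2,y\n"))  # 0 (duplicate delivery ignored)
```

Real Snowpipe tracks load history per file name within a retention window rather than hashing contents, so treat this only as a model of the at-most-once guarantee.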

Posted 3 months ago

10 - 15 years

27 - 32 Lacs

Pune

Work from Office

Position: Project Manager - ETL
Experience: 10-15 years
Budget: 28 LPA - 32 LPA
Location: Pune
Notice Period: Immediate to 15 days
Contact: 8409250974
Required candidate profile - Mandatory skills: ETL (SSIS/Informatica/Talend/Ab Initio) AND Azure (ADF/Synapse/Databricks) AND migration project experience (1+ migrations completed). Good to have: fixed-price project experience.

Posted 3 months ago

12 - 14 years

15 - 30 Lacs

Pune

Work from Office

Mandatory Skills:
• On-premise ETL tools: Informatica, DataStage, Talend, SSIS
• Cloud ETL tools: Azure Data Factory (ADF) or Databricks
• Data architecture & Azure architecture
• Snowflake
Contact: Dipika - 8409250974

Posted 3 months ago

7 - 12 years

9 - 14 Lacs

Pune

Work from Office

Around 7 years of experience in software development, with strong experience in Talend, SQL, and Unix development. Expertise in designing and implementing ETL pipelines using Talend. In-depth understanding of SQL queries, stored procedures, and data modeling concepts. Experience with data warehousing concepts. Strong understanding of Unix/Linux commands and scripting. Knowledge of data warehouse and data lake concepts. Familiarity with cloud-based data services is a plus (e.g., Snowflake, AWS Redshift). Excellent troubleshooting and debugging skills for ETL and data pipelines. Experience with data quality and validation tools. Experience with Python or other scripting languages.

Posted 3 months ago

6 - 11 years

8 - 12 Lacs

Hyderabad

Work from Office

Design, develop, and maintain ETL processes using Ab Initio and other ETL tools. Manage and optimize data pipelines on AWS. Write and maintain complex PL/SQL queries for data extraction, transformation, and loading. Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly. Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Ensure data quality and integrity through rigorous testing and validation. Stay updated with the latest industry trends and technologies in ETL and cloud computing. Requirements: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Certification in Ab Initio. Proven experience with AWS and cloud-based data solutions. Strong proficiency in PL/SQL and other ETL tools. Experience in providing Level 3 support for ETL processes. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Preferred Qualifications: Experience with other ETL tools such as Informatica, Talend, or DataStage. Knowledge of data warehousing concepts and best practices. Familiarity with scripting languages (e.g., Python, Shell scripting).

Posted 3 months ago

10 - 15 years

7 - 10 Lacs

Chennai

Work from Office

Job description
Talend: Designing, developing, and documenting existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tooling to integrate Talend data and ensure data quality in a big data environment.
Snowflake SQL: Writing SQL queries against Snowflake. Developing scripts (Unix, Python, etc.) to extract, load, and transform data. Hands-on experience with Snowflake utilities and features such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, the query optimizer, metadata management, data sharing, and stored procedures.
Perform data analysis, troubleshoot data issues, and provide technical support to end users. Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Complex problem-solving capability and a continuous-improvement mindset.
Desirable: Talend/Snowflake certification. Excellent SQL coding skills. Excellent communication and documentation skills. Familiarity with the Agile delivery process. Must be analytical, creative, and self-motivated. Works effectively within a global team environment.
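The extract-load-transform scripting this listing describes can be illustrated with a minimal Python sketch. This is a simplified stand-in under stated assumptions: the record shapes and cleansing rule are hypothetical, and a real job would read from and write to Snowflake via the connector or Talend components rather than in-memory lists.

```python
# Minimal ELT sketch: extract raw records, load them into a staging
# area, then transform. All record shapes are illustrative only.

def extract(source_rows):
    """Extract: pull raw records from the (simulated) source."""
    return list(source_rows)

def load(staging, rows):
    """Load: append raw rows into the staging area before transforming."""
    staging.extend(rows)
    return staging

def transform(staging):
    """Transform: normalize customer names, cast amounts, drop nulls."""
    return [
        {"customer": r["customer"].strip().upper(), "amount": float(r["amount"])}
        for r in staging
        if r.get("amount") is not None
    ]

source = [
    {"customer": " acme ", "amount": "120.50"},
    {"customer": "globex", "amount": None},  # dropped: missing amount
]
staging = load([], extract(source))
result = transform(staging)
print(result)  # one cleaned row for ACME
```

The ELT ordering (load raw data first, transform inside the warehouse afterwards) is what distinguishes this pattern from classic ETL, and is the usual fit for Snowflake.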

Posted 3 months ago

6 - 11 years

8 - 13 Lacs

Hyderabad

Work from Office

We are looking for a detail-oriented QA Engineer to ensure the quality and accuracy of data migration projects. The ideal candidate will be responsible for validating data integrity, testing migration processes, and identifying discrepancies or issues. This role requires expertise in QA methodologies, strong analytical skills, and familiarity with data migration processes and tools.
Key Responsibilities
Data Validation and Testing: Develop and execute comprehensive test plans and test cases to validate data migration processes. Ensure data integrity, accuracy, and consistency across source and target systems. Perform pre- and post-migration data checks to verify successful migration.
Test Automation: Design and implement automated test scripts for data validation and reconciliation. Use appropriate tools to streamline testing processes and reduce manual effort.
Defect Identification and Resolution: Identify, document, and report issues or discrepancies in the data migration process. Collaborate with development teams to troubleshoot and resolve data-related defects.
Collaboration and Communication: Work closely with data engineers, business analysts, and stakeholders to understand migration requirements and objectives. Provide regular updates on testing progress, results, and identified risks.
Process Improvement: Recommend and implement best practices for data migration testing and validation. Continuously improve QA processes to enhance efficiency and effectiveness.
Documentation: Maintain clear and detailed documentation of test plans, test cases, and test results. Ensure proper tracking and reporting of issues using defect management tools.
Requirements
Bachelor's degree in Computer Science, Information Technology, or a related field. 3+ years of experience in quality assurance or data testing, preferably in data migration projects. Strong knowledge of SQL for querying and validating data. Familiarity with data migration tools and ETL processes (e.g., Informatica, Talend, or similar). Hands-on experience with test automation tools (e.g., Selenium, TestNG, or similar). Understanding of data governance, privacy, and security principles. Strong analytical skills with attention to detail. Excellent communication and collaboration abilities.
Preferred Qualifications: Experience with cloud-based data migration (e.g., AWS, Azure, GCP). Familiarity with big data frameworks and tools (e.g., Hadoop, Spark). Knowledge of Agile methodologies and tools like Jira or Confluence.
Qualification: Any degree or equivalent. Experience: 6+ years.
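A typical pre/post-migration check of the kind described above is a row-count and checksum reconciliation between source and target. The sketch below is a simplified illustration (the in-memory tables and the `id` key are hypothetical; a real check would query both databases and compare results):

```python
import hashlib

def row_fingerprint(row):
    """Stable checksum of a row, independent of key order."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()

def reconcile(source_rows, target_rows, key):
    """Compare source vs. target: counts, missing keys, changed rows."""
    src = {r[key]: row_fingerprint(r) for r in source_rows}
    tgt = {r[key]: row_fingerprint(r) for r in target_rows}
    return {
        "count_match": len(src) == len(tgt),
        "missing_in_target": sorted(set(src) - set(tgt)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

source = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Ravi"}]
target = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "RAVI"}]  # drifted
report = reconcile(source, target, key="id")
print(report)  # count_match True, id 2 flagged as mismatched
```

Fingerprinting rows rather than comparing them field by field keeps the comparison cheap and lets the same check run unchanged as columns are added.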

Posted 3 months ago

5 - 10 years

4 - 7 Lacs

Hyderabad

Work from Office

Job Description: Looking for an ETL developer to work on the Talend technology platform.
Key Responsibilities: Understand requirements from both business and technology clients, offering timely solutions and developing, testing, and implementing them. Contribute to the development of the next-gen Alternative Investments tech landscape for competitive advantage, supporting and enhancing existing applications. Interact with testers, the development team, and other stakeholders to coordinate all special testing activities (post-production validation, etc.). Provide L3 support to the business from a development perspective.
Technical Skills: Strong in Talend development, Unix shell scripts, and the Autosys scheduler. Database programming (Oracle, MongoDB). Experience with tools such as Bitbucket, Git, uDeploy, RLM, and Jira.
Soft Skills: Strong communication skills. Team player, collaborative. Ability to work under aggressive timelines. Persistence - the ability to get things done. Experience or interest in Agile development. Multitasking across a number of projects in parallel. Innovative; challenges the status quo.
Competencies: Experience with investment products (alternative investments such as private equity and hedge funds). Ability to own and drive tasks/assignments while collaborating with peers and other technology teams. Balancing fast time to market (client delivery) with proper governance, compliance, and audit mandates. Ability to work closely with teams across the globe for project development and delivery.
Qualifications: 4-6 years of relevant experience in an apps development and systems analysis role. Extensive experience in system analysis and in programming of software applications. Subject matter expert (SME) in at least one area of applications development. Ability to adjust priorities quickly as circumstances dictate. Consistently demonstrates clear and concise written and verbal communication.

Posted 3 months ago

7 - 12 years

5 - 9 Lacs

Pune

Work from Office

Job description: 4+ years of experience in Talend, SQL, and Unix. Hands-on experience in developing ETL jobs and workflows using Talend. Strong SQL skills, including the ability to write complex queries and optimize performance. Basic knowledge of Unix/Linux shell scripting. Familiarity with data integration processes and tools. Strong communication and teamwork skills. Experience with data warehousing concepts. Good problem-solving and analytical skills. Experience with data quality and validation tools. Experience with Python or other scripting languages.

Posted 3 months ago

4 - 9 years

6 - 15 Lacs

Bengaluru

Work from Office

Job Purpose and Impact
As a Data Engineer at Cargill, you will work across the full stack to design, develop, and operate high-performance, data-centric solutions using our comprehensive and modern data capabilities and platforms. You will play a critical role in enabling analytical insights and process efficiencies for Cargill's diverse and complex business environments. You will work in a small team that shares your passion for building innovative, resilient, and high-quality solutions while sharing, learning, and growing together.
Key Accountabilities
Collaborate with business stakeholders, product owners, and across your team on product or solution designs. Develop robust, scalable, and sustainable data products or solutions utilizing cloud-based technologies. Provide moderately complex technical support through all phases of the product or solution life cycle. Perform data analysis, handle data modeling, and configure and develop data pipelines to move and optimize data assets. Build moderately complex prototypes to test new concepts, provide ideas on reusable frameworks, components, and data products or solutions, and help promote adoption of new technologies. Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff. Other duties as assigned.
Qualifications
MINIMUM QUALIFICATIONS
Bachelor's degree in a related field or equivalent experience. Minimum of two years of related work experience. Other minimum qualifications may apply.
PREFERRED QUALIFICATIONS
Experience developing modern data architectures, including data warehouses, data lakes, data meshes, hubs, and associated capabilities including ingestion, governance, modeling, observability, and more. Experience with data collection and ingestion capabilities, including AWS Glue, Kafka Connect, and others.
Experience with data storage and management of large, heterogeneous datasets, including formats, structures, and cataloging, with tools such as Iceberg, Parquet, Avro, ORC, S3, HDFS, Hive, Kudu, or others. Experience with transformation and modeling tools, including SQL-based transformation frameworks and orchestration and quality frameworks such as dbt, Apache NiFi, Talend, AWS Glue, Airflow, Dagster, Great Expectations, Oozie, and others. Experience working in Big Data environments, including tools such as Hadoop and Spark. Experience working in cloud platforms, including AWS, GCP, or Azure. Experience with streaming and stream-integration or middleware platforms, tools, and architectures such as Kafka, Flink, JMS, or Kinesis. Strong programming knowledge of SQL, Python, R, Java, Scala, or equivalent. Proficiency with engineering tooling, including Docker, Git, and container orchestration services. Strong experience working in DevOps models, with a demonstrable understanding of associated best practices for code management, continuous integration, and deployment strategies. Experience and knowledge of data governance considerations, including quality, privacy, and security, and the associated implications for data product development and consumption.

Posted 3 months ago

7 - 12 years

15 - 25 Lacs

Navi Mumbai

Work from Office

Role & responsibilities
• Design, develop, and deploy interactive dashboards, reports, and data visualizations using Power BI to support business decision-making.
• Extract, transform, and load (ETL) data from various sources (e.g., SQL databases, Excel, APIs, cloud platforms) for analysis and reporting.
• Collaborate with business stakeholders to gather requirements, identify key performance indicators (KPIs), and deliver tailored BI solutions.
• Optimize data models to ensure performance, scalability, and accuracy of reporting outputs.
• Integrate AI analytics capabilities into data and reporting services.
• Maintain and enhance existing BI solutions, troubleshoot issues, and ensure data integrity and consistency.
• Integrate Power BI with other tools and platforms as needed.
• Leverage AI services or Power BI's built-in AI features (e.g., Key Influencers, Q&A) to uncover deeper insights and automate reporting processes.
• Stay current with industry trends, Power BI updates, and advancements in AI analytics to recommend improvements and implement best practices.
• Provide training and support to end users to maximize adoption and effective use of BI tools, including AI-enhanced features.
• Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures.
Preferred candidate profile
• Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
• 7+ years of experience in business intelligence, data analytics, or a similar role.
• Proficiency in BI tools (e.g., Power BI, Tableau, QlikView) and data visualization best practices.
• Strong knowledge of SQL and experience with relational databases (e.g., MySQL, SQL Server, PostgreSQL).
• Experience with ETL processes and tools (e.g., SSIS, Talend, Informatica).
• Familiarity with data warehousing concepts and technologies (e.g., Snowflake, Redshift).
• Basic programming skills (e.g., Python, R) are a plus.
• Excellent problem-solving skills and attention to detail.
• Strong communication skills to liaise with technical and non-technical stakeholders.
• Experience with the AWS cloud platform.
• Knowledge of statistical analysis or machine learning concepts.
Perks and benefits

Posted 3 months ago

9 - 14 years

11 - 17 Lacs

Pune

Work from Office

About The Role:
Primary Skills (must have): Experience in crafting deals/presales in Business Intelligence, DWH & Big Data Analytics. Good understanding of the Banking & Financial Services domain. Good understanding of FinTechs and their roles in the analytics ecosystem. Strong communication and customer-facing skills; ability to interact with customer business SMEs. Knowledge of market-leading BI & EDW, Big Data, AI, Analytics, Cloud, and MDM tools/solutions. Knowledge of Data Science and solutions. Knowledge of Cloud and/or Data Warehouse technologies - AWS/Azure/GCP or any of Informatica / DataStage / Talend / Hadoop / Spark / Python / R. Any ESG project or presales experience is an added advantage.
Responsibilities include: Working with the Account/Sales/Presales team, crafting proactive solutions and converting leads to opportunities. Solution conceptualization and go-to-market planning along with the Sales & Marketing team. Liaise with other internal teams and senior management to meet review requirements prior to bid/presales deal submission. Present proposal defenses, solution walkthroughs, and consulting advice to clients. Deliver sales presentations and manage client visits. Analyze market trends and propose new solution accelerators, reusable frameworks, and industry solutions to the COE. Conduct road shows and events with customers in different regions. Clearly articulate requirements and produce detailed estimates using the customer and Capgemini estimation methodologies and templates. Liaise with the Capgemini Market Intelligence team and Product Alliance team for RFP responses.
Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications.
1. Applies scientific methods to analyse and solve software engineering problems.
2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development, and maintenance.
3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. The software engineer builds skills and expertise of his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities.
5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders.
Grade Specific Skills (competencies): Verbal Communication

Posted 3 months ago

2 - 6 years

6 - 10 Lacs

Bengaluru

Work from Office

Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive. Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning. Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems. Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions. Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting. Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows. Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance. Collaborate closely with DevOps teams utilizing SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle. Communicate effectively with both technical and non-technical stakeholders for handover, incident management reporting, etc.
Required education: Bachelor's degree
Preferred education: Master's degree
Required technical and professional expertise: Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive. Extensive experience with AWS services, including S3, EC2, and EMR. Strong expertise in data warehousing and SQL, with experience in performance optimization. Experience with ETL/ELT implementation (such as Talend). Proficiency in Linux, with a strong background in shell scripting.
Preferred technical and professional experience: Familiarity with scheduling tools like Airflow or Control-M. Experience with metadata-driven frameworks. Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence. Excellent communication skills and a willing attitude toward learning.
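The "metadata-driven framework" idea mentioned above - driving transformations from configuration rather than hard-coded logic - can be sketched in plain Python. The mapping format, table, and field names below are hypothetical, not any specific framework's:

```python
# Metadata-driven transformation sketch: column mappings live in a
# config dict, so adding a new target column is a metadata change,
# not a code change. All names are illustrative only.

METADATA = {
    "target_columns": {
        "order_id":  {"source": "id",     "cast": int},
        "customer":  {"source": "cust",   "cast": str},
        "total_usd": {"source": "amount", "cast": float},
    }
}

def apply_mapping(row, metadata):
    """Build a target row by looking up each column's source and cast."""
    return {
        target: spec["cast"](row[spec["source"]])
        for target, spec in metadata["target_columns"].items()
    }

raw = {"id": "42", "cust": "Acme", "amount": "99.9"}
print(apply_mapping(raw, METADATA))
```

In practice the metadata would be stored in a control table or YAML file and the same generic job would be reused across hundreds of feeds, which is what makes the approach attractive for governance and reuse.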

Posted 3 months ago

10 - 14 years

16 - 20 Lacs

Bengaluru

Work from Office

Job Description: We are seeking an experienced Data Architect to design, build, and optimize scalable data solutions. The ideal candidate will have deep expertise in AWS Glue, ETL pipelines/tools (like Talend), EDI interfaces, SQL, data modeling, and data analysis, ensuring efficient data processing and integration across various platforms.
Roles and Responsibilities
Key Responsibilities: Architect and implement data solutions using AWS Glue and other AWS services. Understand requirements, then design and optimize ETL (Extract, Transform, Load) pipelines to support data ingestion, transformation, and integration. Develop and maintain data models for structured and semi-structured data, ensuring scalability and performance. Write, optimize, and manage SQL queries and stored procedures for data processing and analytics. Conduct data analysis to support business intelligence and decision-making. Ensure data quality, governance, and security best practices are followed. Collaborate with data engineers, analysts, and business stakeholders to define data requirements and architecture. Monitor and troubleshoot data pipelines to maintain system reliability and performance. Review work products and provide guidance to data integration developers and analysts. Interface with onshore architects and customers to provide solution overviews and estimates.
Required Skills & Qualifications: 10+ years of experience in data architecture, data engineering, or a related field. Strong expertise in AWS Glue, AWS Lambda, AWS S3, AWS IAM, and other AWS data services. Hands-on experience with ETL development and data pipeline orchestration. Proficiency in SQL and database technologies such as PostgreSQL, MySQL, Redshift, or Snowflake. Solid understanding of data modeling (relational and dimensional) and schema design. Experience with Python or Scala for data transformation and automation. Knowledge of data governance, security, and compliance best practices. Strong problem-solving skills and ability to work in a fast-paced, agile environment.
Preferred Qualifications: Exposure to machine learning pipelines and data lake architectures. Experience with Talend Data Integration. AWS certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect. Exposure to Azure data integration technologies.

Posted 3 months ago
