3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 month ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Oracle Data Integrator (ODI)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in troubleshooting and optimizing application performance, while documenting your work to maintain clarity and facilitate future enhancements. Your role will require you to stay updated with the latest technologies and methodologies to ensure that the applications you develop are efficient and effective in meeting user needs.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Oracle Data Integrator (ODI).
- Strong understanding of data integration techniques and ETL processes.
- Experience with database management systems and SQL.
- Familiarity with application development methodologies and frameworks.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle Data Integrator (ODI).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
6.0 - 10.0 years
4 - 8 Lacs
Pune
Work from Office
Position Overview
Summary: The Data Engineer will expand and optimize the data and data pipeline architecture, as well as optimize data flow and collection for cross-functional teams. The Data Engineer will perform data architecture analysis, design, development, and testing to deliver data applications, services, interfaces, ETL processes, reporting, and other workflow and management initiatives. The role will also follow modern SDLC principles, test-driven development, source code reviews, and change control standards in order to maintain compliance with policies. This role requires a highly motivated individual with strong technical ability, data capability, excellent communication and collaboration skills, and the ability to develop and troubleshoot a diverse range of problems.

Responsibilities
Design and develop enterprise data architecture solutions using Hadoop and other data technologies like Spark and Scala.
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
Responsibilities:
- Collaborate with business stakeholders to understand data requirements and objectives.
- Analyse complex datasets to extract valuable insights and trends.
- Interpret data to identify opportunities for process improvements and business growth.
- Develop reports, dashboards, and visualisations to communicate findings effectively.
- Conduct data quality assessments and ensure data integrity across systems.
- Support data-driven decision-making by providing actionable recommendations.
- Work with cross-functional teams to implement data solutions that meet business needs.
- Stay informed on industry trends and best practices in data analysis and business intelligence.

Qualifications:
- Bachelor's degree in Business Administration, Statistics, Computer Science, or a related field.
- Proven experience as a Business Analyst or Data Analyst in a corporate setting.
- Proficiency in data analysis tools (e.g., SQL, Excel, Amazon QuickSight) and database querying.
- Strong analytical skills with the ability to translate complex data into actionable insights.
- Excellent communication and presentation skills.
- Experience with data visualization tools and techniques.
- Knowledge of business processes and operations.

Preferred Skills:
- Familiarity with ETL processes and data warehousing concepts.
- Experience with predictive analytics and statistical modelling.
- Certification in business analysis or a related field is a plus.
- Knowledge of Agile methodologies for project management.
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
The AI/Data Engineer will be responsible for designing, implementing, and maintaining scalable data solutions. This role will involve working with various data tools and technologies to ensure efficient data processing, integration, and visualization.

Key Responsibilities:
- Develop and maintain data pipelines and ETL processes to ingest and transform data from multiple sources.
- Design, implement, and manage data models and databases using SQL.
- Utilize Python for data manipulation, analysis, and automation tasks.
- Administer and automate processes on Linux systems using shell scripting and tools like PuTTY.
- Schedule and monitor jobs using Control-M or similar scheduling tools.
- Create interactive dashboards and reports using Tableau and Power BI to support data-driven decision-making.
- Collaborate with data scientists and analysts to support AI/ML model deployment and integration.
- Ensure data quality, integrity, and security across all data processes.
- Utilize version control software, such as Git and Bitbucket, to manage and track code changes effectively.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- At least 3 years of proven experience as a Data Engineer, AI Engineer, or in a similar role.
- Proficiency in Python and SQL.
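The pipeline-and-ETL responsibilities above can be sketched minimally as follows. This is an illustrative sketch only, assuming nothing about the actual stack: it ingests CSV text, applies a transform step, and loads the result into a SQL table, with sqlite3 standing in for the real warehouse and all table/column names invented.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: whitespace and casing are inconsistent on purpose.
RAW = "id,amount,country\n1, 10.5 ,in\n2,20.0,us\n"

def transform(row):
    # The "T" of ETL: normalize types and casing before loading.
    return (int(row["id"]), float(row["amount"].strip()), row["country"].upper())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, country TEXT)")
rows = [transform(r) for r in csv.DictReader(io.StringIO(RAW))]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

In a real deployment the same extract/transform/load shape would be orchestrated by a scheduler such as Control-M, as the posting notes.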
Posted 1 month ago
6.0 - 11.0 years
8 - 14 Lacs
Gurugram
Work from Office
As a BI ETL Test Engineer, you take care of the testing of BI systems. This includes validation of the business data flow, ETL components, data lineage, and ETL architecture, and you are able to analyze defects during data validation. It also covers setting up the testing strategy, recommending tools, and performing technical feasibility and risk assessments.

Primary Skills
As a BI ETL Test Specialist, you are expected to be a subject matter expert in this area of specialised testing. This includes understanding the business data flow, ETL components, data lineage, and ETL architecture, and being able to analyze defects during data validation. You have good technical knowledge of databases, Unix/Linux, and ETL and BI tools. You are expected to develop the testing strategy, recommend tools, perform technical feasibility, conduct risk assessments, and build business cases (ROI). You are expected to own delivery of specialised testing projects and to work independently to provide technical support and guidance.

Skills (competencies)
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Skillset:
- In-depth knowledge of Azure Synapse Analytics (with dedicated pools)
- Proficient in Azure Data Factory (ADF) for ETL processes
- Strong SQL skills for complex queries and data manipulation
- Knowledge of data warehousing and big data analytics
- Good analytical and problem-solving skills
Posted 1 month ago
6.0 - 11.0 years
8 - 14 Lacs
Pune
Work from Office
Responsibilities:
- Design, develop, and maintain scalable data pipelines using Databricks, PySpark, Spark SQL, and Delta Live Tables.
- Collaborate with cross-functional teams to understand data requirements and translate them into efficient data models and pipelines.
- Implement best practices for data engineering, including data quality and data security.
- Optimize and troubleshoot complex data workflows to ensure high performance and reliability.
- Develop and maintain documentation for data engineering processes and solutions.

Requirements:
- Bachelor's or Master's degree.
- Proven experience as a Data Engineer, with a focus on Databricks, PySpark, Spark SQL, and Delta Live Tables.
- Strong understanding of data warehousing concepts, ETL processes, and data modelling.
- Proficiency in programming languages such as Python and SQL.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong leadership and communication skills, with the ability to mentor and guide team members.
Posted 1 month ago
6.0 - 11.0 years
3 - 7 Lacs
Karnataka
Hybrid
PF Detection is mandatory.
Looking for a candidate with over 6 years of hands-on involvement in Snowflake. The primary expertise required is in Snowflake: the candidate must be capable of creating complex SQL queries for manipulating data and should excel at implementing complex scenarios within Snowflake. The candidate should also possess a strong foundation in Informatica PowerCenter, with proficiency in executing ETL processes.
- Strong hands-on experience in SQL and RDBMS
- Strong hands-on experience in Unix shell scripting
- Knowledge of data warehousing and cloud data warehousing
- Good communication skills
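The "complex SQL" this posting refers to often means window-function work such as keeping only the latest record per key. A hedged sketch of that pattern follows; sqlite3 is used purely so the query runs locally, and in Snowflake itself the inner subquery could be replaced by a `QUALIFY ROW_NUMBER() ... = 1` clause. Table and column names are invented for illustration.

```python
import sqlite3

# Hypothetical staging table with multiple loads per customer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_stg (cust_id INT, email TEXT, load_ts INT);
INSERT INTO customer_stg VALUES
  (1, 'old@x.com', 100), (1, 'new@x.com', 200), (2, 'b@x.com', 150);
""")

# Keep only the most recent row per cust_id via ROW_NUMBER().
latest = conn.execute("""
SELECT cust_id, email FROM (
  SELECT cust_id, email,
         ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY load_ts DESC) AS rn
  FROM customer_stg
) WHERE rn = 1 ORDER BY cust_id
""").fetchall()
```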
Posted 1 month ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Hi, greetings from IDESLABS. This is Navya from IDESLABS; we have a requirement in ETL Testing for one of our clients, for a contract-to-hire role.

Job Details:
- Skills: ETL Testing
- Experience: 7+ years
- Location: Bangalore
- Job type: Contract to hire
- Payroll company: IDESLABS
- Work model: Hybrid

JD (primary skills: ETL testing / strong SQL):
Looking for an ETL/DB tester with 5+ years of experience.
- Should have strong SQL skills.
- Should have hands-on coding knowledge in any scripting language.
- Should be able to design and write SQL queries for data validation.
- Should be able to verify and test ETL processes.
- Understanding of data warehousing concepts is a plus.
- Good communication skills and a testing mindset.
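One of the most common data-validation queries an ETL tester writes is a source-to-target reconciliation: comparing row counts and an aggregate checksum between the table that fed the load and the table that received it. A minimal, hypothetical sketch (sqlite3 as a local stand-in; table and column names invented):

```python
import sqlite3

# Toy source and target tables that an ETL job supposedly copied.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INT, amt REAL);
CREATE TABLE tgt (id INT, amt REAL);
INSERT INTO src VALUES (1, 10.0), (2, 2.5);
INSERT INTO tgt VALUES (1, 10.0), (2, 2.5);
""")

def reconcile(conn, source, target):
    # Compare row count and SUM(amt) between the two tables.
    q = "SELECT COUNT(*), COALESCE(SUM(amt), 0) FROM "
    s_cnt, s_sum = conn.execute(q + source).fetchone()
    t_cnt, t_sum = conn.execute(q + target).fetchone()
    return {"count_match": s_cnt == t_cnt, "sum_match": s_sum == t_sum}

result = reconcile(conn, "src", "tgt")
```

A real test suite would extend this with column-level checksums and sample-row comparisons, but the count-and-sum pattern is the usual first gate.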
Posted 1 month ago
3.0 - 6.0 years
6 - 9 Lacs
Hyderabad
Work from Office
Spark & Delta Lake
- Understanding of Spark core concepts like RDDs, DataFrames, Datasets, Spark SQL, and Spark Streaming.
- Experience with Spark optimization techniques.
- Deep knowledge of Delta Lake features like time travel, schema evolution, and data partitioning.
- Ability to design and implement data pipelines using Spark, with Delta Lake as the data storage layer.
- Proficiency in Python/Scala/Java for Spark development and integration with ETL processes.
- Knowledge of data ingestion techniques from various sources (flat files, CSV, APIs, databases).
- Understanding of data quality best practices and data validation techniques.

Other Skills:
- Understanding of data warehouse concepts and data modelling techniques.
- Expertise in Git for code management.
- Familiarity with CI/CD pipelines and containerization technologies.
- Nice to have: experience using data integration tools like DataStage, Prophecy, Informatica, or Ab Initio.
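The data-quality and validation points above can be sketched as simple per-column rule checks. This plain-Python version is illustrative only (no Spark cluster required, and the column names are invented); the same rules would map naturally onto Spark DataFrame filters or Delta Lake table constraints.

```python
import csv
import io

# Hypothetical quality rules: each column name maps to a predicate.
RULES = {
    "id": lambda v: v.strip().isdigit(),                         # non-empty numeric key
    "qty": lambda v: v.strip().lstrip("-").isdigit() and int(v) >= 0,  # non-negative int
}

def validate(csv_text):
    # Split ingested rows into passing and failing sets.
    good, bad = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        ok = all(check(row.get(col, "")) for col, check in RULES.items())
        (good if ok else bad).append(row)
    return good, bad

good, bad = validate("id,qty\n1,5\n2,-3\nx,7\n")
```

In a Spark pipeline, the failing set would typically be routed to a quarantine table rather than dropped, so defects stay auditable.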
Posted 1 month ago
7.0 - 12.0 years
6 - 9 Lacs
Hyderabad
Work from Office
- Understanding of Spark core concepts like RDDs, DataFrames, Datasets, Spark SQL, and Spark Streaming.
- Experience with Spark optimization techniques.
- Deep knowledge of Delta Lake features like time travel, schema evolution, and data partitioning.
- Ability to design and implement data pipelines using Spark, with Delta Lake as the data storage layer.
- Proficiency in Python/Scala/Java for Spark development and integration with ETL processes.
- Knowledge of data ingestion techniques from various sources (flat files, CSV, APIs, databases).
- Understanding of data quality best practices and data validation techniques.

Other Skills:
- Understanding of data warehouse concepts and data modelling techniques.
- Expertise in Git for code management.
- Familiarity with CI/CD pipelines and containerization technologies.
- Nice to have: experience using data integration tools like DataStage, Prophecy, Informatica, or Ab Initio.
Posted 1 month ago
1.0 - 5.0 years
5 - 9 Lacs
Pune
Work from Office
Job Title: Operations Engineer, Associate
Location: Pune, India

Role Description
Responsible for the day-to-day maintenance of the application systems in operation, including tasks related to identifying and troubleshooting application issues, and issue resolution or escalation. Responsibilities also include root cause analysis, management communication, and client relationship management in partnership with Infrastructure Service Support team members. Ensures all production changes are made in accordance with life-cycle methodology and risk guidelines. Responsible for coaching and mentoring less experienced team members and/or acting as a subject matter expert.

Requires in-depth functional knowledge of the application(s) supported and their interdependencies. The engineer is an experienced and detail-oriented person capable of integrating product knowledge, research, and testing to answer complex questions about product behavior and provide end-to-end solutions that permanently fix issues. The engineer will assist customer teams and other team members in understanding how customers can achieve desired outcomes using the applications as they exist today. The output could range from FAQs and knowledge base articles that describe to customers how to operate the product to achieve selected outcomes, to end-to-end coding solutions for the issue reported. The engineer will liaise with global stakeholders and vendors to deliver technology solutions as part of the yearly book of work, and should be able to understand the functional requirements and expectations of the various stakeholders and work towards an appropriate plan of action. The role also requires working with product vendors and leading upgrades as applicable.
What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Researching, designing, implementing, and managing software programs
- Testing and evaluating new programs
- Identifying areas for modification in existing programs and subsequently developing these modifications
- Overseeing resolution of technical issues coming from customer teams
- Fixing and delivering on customer issues
- Following ITIL processes, including incident management, change management, release management, problem management, and knowledge management
- Strong problem-solving skills with good communication skills, and the ability to work under pressure with a high sense of urgency
- Proactively identifying potential incidents and problems as well as availability issues
- Managing any IT security incidents that may occur in the application
- Identifying risks and issues and contributing to service-management-related audits
- Perform environment maintenance and management
- Deploy software tools, processes, and metrics
- Perform standard recurring activities like data and environment refreshes
- Be a liaison between the customer-facing teams and the Product and Engineering org for management and resolution of all technical questions and issues
- Work closely with other developers and with business and systems analysts
- Maintain detailed documentation, ranging from knowledge base articles to live logging of incidents for post-mortems
- Ensure delivery timelines and SLA obligations established with internal and external stakeholders are observed and met; escalate as necessary using judgment and discretion
- Develop a deep understanding of the application platform across all product lines and clearly articulate support decisions and findings
- Work closely with internal teams to stay up to date on product features, changes, and issues

Your skills and experience
- Must have a total of 6+ years of experience, with at least 5 years in software development/support engineering
- Must have advanced knowledge of Java / C# / .NET debugging and scripting (PowerShell, Unix shell, or any other)
- Must have advanced knowledge of MS SQL Server, SSIS, Tableau, and ETL processes
- Working knowledge of SDLC and Agile processes
- Demonstrable experience in leading projects to successful conclusions
- Strong customer focus, with experience of working with cross-functional/cross-department teams
- A self-starter with strong organization skills, resolution management, and superior written and verbal communication skills

Educational/Qualifications: B.E. / B.Tech. / Master's degree in computer science or equivalent. ITIL certification is good to have.

How we'll support you

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day.
This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
4.0 - 9.0 years
10 - 16 Lacs
Hyderabad, Gurugram
Work from Office
JOB RESPONSIBILITIES
Below are the key roles and responsibilities for this position:
- Interact with L1 and keep run-books and standard support operational procedures (SOPs) up to date
- Perform ad-hoc support tasks and prepare reports for business
- Troubleshoot all new incidents/issues for which no knowledge base is available and which are escalated by the L1 team
- Resolve incidents/issues as per the agreed SLA
- Address service requests as per the agreed SLA
- Follow the escalation process to escalate to the L3 team or the next level of support, per the escalation matrix, if unable to resolve the issue within the agreed time window
- Prepare the incident post-mortem / RCA (root cause analysis) report for incidents and share it with all stakeholders within the agreed timeline
- Perform shift handover activities as per agreed SOPs
- Participate in problem management, change management, knowledge management, event management, etc.
- Update the knowledge base with new learnings, changes in resolution steps, etc. in a timely manner
- Ensure SLAs/KPIs under the control of the L2 team (like incident response time) are met, and collect/update that data in the required tools
- Help the shift/team lead prepare the various operational reports required by internal and external stakeholders
- Contribute to reusable support assets and internal knowledge-sharing sessions
- Work to build up skills for the L3 support level
- Work on POCs of different solutions, tools, etc.
- Coordinate with L1 and L3 engineers (as per case requirements) on various issues, incidents, service requests, user queries, changes in the environment, events, etc.
- Work on automation of manual activities where possible and share the details of automation opportunities with the L3 team

Personal Attributes:
- Systematic problem-solving approach, coupled with effective communication skills and a sense of drive
- Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
- Should be flexible to work in all shifts
- Ability to prioritize when under pressure

SKILL REQUIREMENTS
Must-have skills:
- Experience supporting and troubleshooting SFMC components
- Hands-on experience with Salesforce MDM for managing enterprise mobile devices
- Ability to support device provisioning, policy enforcement, and remote troubleshooting
- Knowledge of certificate-based authentication, secure app deployment, and clipboard/data protection policies
- Digital Data Exchange (DDX): experience with data integration and synchronization between SFMC, CRM, and third-party systems using DDX or similar frameworks
- Ability to troubleshoot data flow issues, ensure data integrity, and support ETL processes
- Familiarity with real-time data exchange protocols, API-based integrations, and compliance with data privacy regulations
- Familiarity and working experience with e-commerce projects
- Working knowledge of the ServiceNow ITSM tool
- Knowledge of production support processes and procedures
- Ability to demonstrate functional and technical architecture knowledge, and to correlate between the two from past experience
- Good exposure to ITIL processes like incident management, problem management, knowledge management, etc.

Nice-to-have skills:
- Good understanding of accessibility, and comfortable using the dev toolbar for debugging
- Some exposure to cloud technologies
- Understanding of how cloud infrastructure is set up, how applications are deployed, and how various services are set up and used
- Ability to understand technical errors in the application log and to understand the solutions provided by L3/development teams, at least at a high level
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
Years: 3+
Notice Period: Immediate joiners

Job description
We are seeking a highly skilled and experienced Informatica Developer to join our team. The ideal candidate will have a strong background in data integration, ETL processes, and data warehousing, with at least 3 years of hands-on experience in Informatica development.

Key Responsibilities:
- Design and Development: Develop, implement, and maintain ETL processes using Informatica PowerCenter and other Informatica tools.
- Data Integration: Integrate data from various sources, ensuring data quality and consistency.
- Performance Tuning: Optimize ETL processes for performance and scalability.
- Collaboration: Work closely with business analysts, data architects, and other stakeholders to understand data requirements and deliver solutions.
- Documentation: Create and maintain technical documentation for ETL processes and data flows.
- Support and Maintenance: Provide ongoing support and maintenance for ETL processes, including troubleshooting and resolving issues.
- Mentorship: Mentor junior developers and provide technical guidance to the team.

Technical Skills:
- Proficiency in Informatica PowerCenter, Informatica Cloud, and other Informatica/ETL tools.
- Strong SQL, Impala, Hive, and PL/SQL skills.
- Experience with data warehousing concepts and BI tools.
- Knowledge of Unix/Linux.
- Knowledge of Python.
- Big Data Frameworks: Proficiency in Sqoop, Spark, Hadoop, Hive, and Impala.
- Programming: Strong coding skills in Python (including PySpark) and Airflow.

Location: Remote
Posted 1 month ago
6.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job location: Bangalore
Job Title: Module Lead - Snowflake
Experience: 6-8 years

Job Description: Sr Snowflake Developer
- Design and develop our Snowflake data platform, including data pipeline building, data transformation, and access management.
- Minimum 4+ years of experience in Snowflake; strong in SQL.
- Develop data warehouse and data mart solutions for business teams.
- Accountable for designing robust, scalable database and data extraction, transformation, and loading (ETL) solutions.
- Understand and evaluate business requirements that impact the Caterpillar enterprise.
- Liaise with data creators to support project planning, training, guidance on standards, and the efficient creation/maintenance of high-quality data.
- Contribute to policies, procedures, and standards as well as technical requirements.
- Ensure compliance with the latest data standards supported by the company, and with brand, legal, and information security (data security and privacy) requirements.
- Document data models for the domains to be deployed, including a logical data model, candidate source lists, and canonical formats.
- Create, update, and enhance metadata policies, processes, and catalogs.
- Good communication skills and experience interacting with client SMEs.
- Should be capable of leading a team of 4-5 members.
- Snowflake certifications mandatory.
Posted 1 month ago
4.0 - 9.0 years
8 - 12 Lacs
Hyderabad, Pune
Work from Office
Sr MuleSoft Developer
- Design and implement MuleSoft solutions using Anypoint Studio, Mule ESB, and other related technologies.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality ETL processes.
- Develop and maintain APIs using RAML and other industry standards.
- Strong understanding of RAML (RESTful API Modeling Language) and its usage in API design.
- Develop complex integrations between various systems, including cloud-based applications such as Snowflake.
- Ensure seamless data flow by troubleshooting issues and optimizing existing integrations.
- Provide technical guidance on best practices for data warehousing, ETL development, and PL/SQL programming.
- Strong understanding of SQL concepts, including database schema design, query optimization, and performance tuning.
- Proficiency in developing complex ETL processes using various technologies such as cloud platforms and data warehousing tools (Snowflake).
- Experience working with multiple databases and the ability to write efficient PL/SQL code snippets.
Posted 1 month ago
4.0 - 5.0 years
16 - 20 Lacs
Hyderabad, Pune
Work from Office
Salesforce Data Migration and Integration Developer

Job Summary: The Data Migration Developer is responsible for executing and managing data migration projects within Salesforce environments. This role requires expertise in data extraction, transformation, and loading (ETL) processes, with a strong focus on leveraging Informatica tools. The specialist will ensure the accurate, secure, and efficient migration of data while customizing Salesforce to align with business processes and objectives.

Required Skills:
- 4-5+ years of experience in data migration, with a focus on Salesforce applications and handling sensitive data.
- Proficiency in using ETL tools like Informatica (PowerCenter, Informatica Cloud), Boomi, or other tools for data migration.
- Experience with Salesforce data extraction and transformation, data import/export tools, SQL, ETL processes, and data integration methodologies.
- Expertise in data migration tools and techniques, and familiarity with Salesforce APIs and integration methods.
- Migrate and integrate data from different platforms into Salesforce.
- Prepare the data migration plan and handle kickouts/fallouts.
- Develop procedures and scripts for data migration.
- Ability to ensure data integrity, deduplication, and performance optimization during migration.
- Experience handling large-scale data migrations with bulk processing.
- Knowledge of data validation and reconciliation post-migration.
- Develop, implement, and optimize stored procedures and functions using T-SQL.
- Perform SQL database partitioning and indexing procedures as required to handle heavy traffic loads.

Must have:
- Strong understanding of the Salesforce data model, objects, relationships, and architecture.
- Strong understanding of Salesforce objects (such as accounts, contacts, cases, etc.), custom objects, fields, and restrictions.
- The ability to create fast and efficient database queries, including joins across several tables.
- Good knowledge of SQL optimization techniques.
- Experience in designing, creating, and maintaining databases.
- Familiarity with MuleSoft, Boomi, or similar integration platforms, and experience automating processes within Salesforce.

Preferred Qualifications:
- Salesforce Certified Administrator, Salesforce Certified Platform Developer I or II.
- Relevant certifications in data management, migration, or related areas.
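The deduplication step this posting calls for can be sketched minimally: keep the most recently modified record per natural key (here, a lowercased email) before loading into the target org. This is an illustrative sketch only; the field names are hypothetical, and a real migration would typically do this in the ETL tool or via Salesforce duplicate rules rather than hand-rolled code.

```python
def dedupe(records, key="email", ts="last_modified"):
    # Keep the newest record per normalized key value.
    latest = {}
    for rec in records:
        k = rec[key].strip().lower()
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return list(latest.values())

# Hypothetical contact extract: two rows collide on the same email.
contacts = [
    {"email": "A@x.com", "last_modified": 1, "name": "Old"},
    {"email": "a@x.com", "last_modified": 2, "name": "New"},
    {"email": "b@x.com", "last_modified": 1, "name": "B"},
]
clean = dedupe(contacts)
```

Post-migration reconciliation would then compare `len(clean)` against the record count reported by the target, matching the validation-and-reconciliation skill listed above.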
Posted 1 month ago
5.0 - 6.0 years
6 - 10 Lacs
Hyderabad, Pune
Work from Office
Sr Workato Developer

Required:
- Experience: 5-6 years of experience in designing, developing, and deploying integration solutions using Workato.
- Technical Proficiency: Strong hands-on experience with Workato, including creating recipes and managing triggers, actions, and jobs. Expertise in API integrations and working with cloud applications (Salesforce, NetSuite, ServiceNow, etc.).
- Integration Experience: Proficiency in integrating SaaS and on-premise applications. Hands-on experience with RESTful and SOAP APIs, webhooks, and other integration protocols.
- Programming Skills: Solid experience with scripting languages (e.g., JavaScript, Python) to handle data transformations, custom actions, and other logic.
- Cloud Platforms: Experience working with cloud platforms like AWS, Azure, or Google Cloud.
- Troubleshooting & Debugging: Strong problem-solving skills to identify, troubleshoot, and resolve complex integration issues.
- Project Management: Ability to manage multiple projects, prioritize tasks, and meet deadlines in a fast-paced environment.
- Communication: Strong communication skills, with the ability to articulate technical concepts clearly to both technical and non-technical stakeholders.

Preferred:
- Certifications: Workato certification or other relevant integration certifications.
- Experience with ETL tools: Knowledge of ETL processes and tools like Talend, Informatica, or similar is a plus.
- Agile Experience: Familiarity with Agile methodologies and working in Scrum teams.
- Database Knowledge: Experience with databases such as MySQL, SQL Server, or PostgreSQL.
- Advanced Skills: Experience with advanced features in Workato, such as advanced mappings, custom connectors, and error handling.
Posted 1 month ago
3.0 - 4.0 years
9 - 14 Lacs
Pune
Work from Office
Salesforce Data Migration and Integration Developer
Job Summary: The Data Migration Developer is responsible for executing and managing data migration projects within Salesforce environments. This role requires expertise in data extraction, transformation, and loading (ETL) processes, with a strong focus on leveraging Informatica tools. The specialist will ensure the accurate, secure, and efficient migration of data while customizing Salesforce to align with business processes and objectives.
Required Skills:
3-4+ years of experience in database migration, with a focus on Salesforce applications and handling sensitive data.
Proficiency in ETL tools such as Informatica (PowerCenter, Informatica Cloud), Boomi, or similar tools for data migration.
Experience with Salesforce data import/export tools, SQL, ETL processes, and data integration methodologies.
Expertise in data migration tools and techniques, and familiarity with Salesforce APIs and integration methods.
Migrate and integrate data from different platforms into Salesforce.
Prepare a data migration plan and handle kickouts/fallouts.
Develop procedures and scripts for data migration.
Develop, implement, and optimize stored procedures and functions using T-SQL.
Perform SQL database partitioning and indexing procedures as required to handle heavy traffic loads.
Strong understanding of Salesforce architecture.
Strong understanding of Salesforce standard objects (such as Account, Contact, and Case), custom objects, fields, and restrictions.
Hands-on experience in data migration and integration from different platforms into Salesforce.
Ability to create fast and efficient database queries, including joins across several tables.
Good knowledge of SQL optimization techniques.
Experience in designing, creating, and maintaining databases.
Familiarity with MuleSoft, Boomi, or similar integration platforms, and experience automating processes within Salesforce.
Preferred Qualifications: Salesforce Certified Administrator, Salesforce Certified Platform Developer I or II. Relevant certifications in data management, migration, or related areas.
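The listing above calls out handling kickouts/fallouts during a Salesforce load. The usual pattern is to divert records that fail validation into a kickout list for remediation rather than failing the whole batch; a minimal sketch is shown below, with illustrative field names rather than an actual Salesforce schema.

```python
# Hedged sketch of kickout handling in a data-migration batch: records
# missing required fields are set aside with a reason, the rest proceed.
REQUIRED_FIELDS = ("LastName", "AccountId")  # illustrative required fields

def partition_records(records):
    """Split records into (loadable, kickouts)."""
    loadable, kickouts = [], []
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            kickouts.append({"record": rec, "missing": missing})
        else:
            loadable.append(rec)
    return loadable, kickouts

batch = [
    {"LastName": "Khan", "AccountId": "001xx0001"},
    {"LastName": "", "AccountId": "001xx0002"},  # kickout: blank LastName
]
good, bad = partition_records(batch)
print(len(good), len(bad))  # 1 loadable, 1 kickout
```

In practice the kickout list would be written to a remediation file or staging table and reloaded after correction, mirroring the error/success files produced by Salesforce load tooling.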
Posted 1 month ago
4.0 - 8.0 years
7 - 11 Lacs
Pune
Work from Office
SSIS - Sr Developer
We are seeking a talented and experienced SQL Server Integration Services (SSIS) Developer to join our dynamic team. The ideal candidate will focus primarily on SSIS, SQL Server Data Tools (SSDT), and SQL Server to design, develop, implement, and maintain data integration solutions.
Responsibilities:
Develop, implement, and maintain SSIS packages for data integration, transformation, and migration tasks.
Collaborate with cross-functional teams to gather requirements and design efficient data integration solutions.
Optimize and tune SSIS packages for performance and scalability.
Troubleshoot and debug SSIS packages to identify and resolve issues.
Develop and maintain documentation related to SSIS packages, data mappings, and data flows.
Work closely with database administrators and other stakeholders to ensure data integrity, security, and compliance.
Stay updated on industry best practices and emerging technologies related to SSIS, SSDT, and SQL Server.
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
[Specify Number] years of experience working with SQL Server Integration Services (SSIS) and SQL Server Data Tools (SSDT).
Strong proficiency in SQL Server and T-SQL scripting.
Experience in designing and implementing data integration solutions using SSIS.
Solid understanding of data warehousing concepts and ETL processes.
Familiarity with data modeling.
Familiarity with database administration tasks such as database design, performance tuning, and troubleshooting.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.
Preferred Qualifications:
Microsoft certifications related to SQL Server and SSIS.
Experience with other data integration tools and technologies.
Knowledge of data visualization tools such as Power BI or Tableau.
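SSIS packages themselves are configured in SSDT rather than written as code, but the core data-flow pattern the role describes (source to transform to destination, with failed rows redirected to an error output) can be sketched in plain Python; sqlite3 and the table names below are purely illustrative.

```python
# Minimal stand-in for an SSIS data flow with error-row redirection:
# rows that fail the numeric conversion go to an error table for review
# instead of aborting the load.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount TEXT)")
conn.execute("CREATE TABLE dst (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE err (id INTEGER, amount TEXT, reason TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [(1, "10.5"), (2, "not-a-number"), (3, "7")])

for row_id, amount in conn.execute("SELECT id, amount FROM src").fetchall():
    try:
        conn.execute("INSERT INTO dst VALUES (?, ?)", (row_id, float(amount)))
    except ValueError as exc:
        # Mirrors an SSIS error output: the bad row is kept with a reason.
        conn.execute("INSERT INTO err VALUES (?, ?, ?)", (row_id, amount, str(exc)))

loaded = conn.execute("SELECT COUNT(*) FROM dst").fetchone()[0]
failed = conn.execute("SELECT COUNT(*) FROM err").fetchone()[0]
print(loaded, failed)  # 2 rows loaded, 1 redirected
```

In SSIS proper, the same behavior is configured on a component's error output (redirect row) rather than with a try/except.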
Benefits:
Competitive salary
Health insurance
Retirement savings plan
Paid time off and holidays
Career development opportunities
Join our team and play a key role in driving our data integration initiatives forward!
Posted 1 month ago
3.0 - 7.0 years
5 - 9 Lacs
Kolkata, Hyderabad, Pune
Work from Office
ETL QA Tester
Job Title: ETL QA Tester
Job Summary: We are looking for an experienced ETL Tester to ensure the quality and integrity of our data processing and reporting systems. The ideal candidate will have a strong background in ETL processes and data warehousing, and experience with Snowflake and Tableau. This role involves designing and executing test plans, identifying and resolving data quality issues, and collaborating with development teams to enhance data processing systems.
Key Responsibilities:
Design, develop, and execute comprehensive test plans and test cases for ETL processes.
Validate data extraction, transformation, and loading processes to ensure accuracy and integrity.
Perform data validation and data quality checks using Snowflake and Tableau.
Identify, document, and track defects and data quality issues.
Collaborate with developers, business analysts, and stakeholders to understand requirements and provide feedback on data-related issues.
Create and maintain test data, test scripts, and test environments.
Generate and analyze reports using Tableau to validate data accuracy and completeness.
Conduct performance testing and optimization of ETL processes.
Develop and maintain automated testing scripts and frameworks for ETL testing.
Ensure compliance with data governance and security standards.
Location: Pune, Hyderabad, Kolkata, Chandigarh
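Two of the checks an ETL tester most commonly automates for the validation work described above are row-count parity and an aggregate checksum between source and target. The sketch below uses sqlite3 purely as a stand-in for a warehouse such as Snowflake; table names are illustrative.

```python
# Minimal reconciliation checks between a source and target table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_orders (id INTEGER, total REAL)")
conn.execute("CREATE TABLE target_orders (id INTEGER, total REAL)")
rows = [(1, 10.0), (2, 25.5), (3, 4.5)]
conn.executemany("INSERT INTO source_orders VALUES (?, ?)", rows)
conn.executemany("INSERT INTO target_orders VALUES (?, ?)", rows)

def counts_match(conn):
    """Row-count parity: every source row arrived in the target."""
    s = conn.execute("SELECT COUNT(*) FROM source_orders").fetchone()[0]
    t = conn.execute("SELECT COUNT(*) FROM target_orders").fetchone()[0]
    return s == t

def totals_match(conn):
    """Aggregate checksum: numeric column sums agree after the load."""
    s = conn.execute("SELECT ROUND(SUM(total), 2) FROM source_orders").fetchone()[0]
    t = conn.execute("SELECT ROUND(SUM(total), 2) FROM target_orders").fetchone()[0]
    return s == t

print(counts_match(conn), totals_match(conn))  # True True
```

The same two queries, wrapped in a test framework's assertions, form the backbone of an automated ETL regression suite.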
Posted 1 month ago
15.0 - 20.0 years
9 - 13 Lacs
Pune
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data architecture and platform functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with industry trends and technologies.
- Assist in the documentation of data platform processes and best practices.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with cloud data services and data warehousing solutions.
- Strong understanding of data integration techniques and ETL processes.
- Familiarity with data modeling concepts and practices.
- Experience working with big data technologies and frameworks.
Additional Information:
- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Coimbatore
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines.
- Ensure data quality throughout the data lifecycle.
- Implement ETL processes for data migration and deployment.
- Collaborate with cross-functional teams to understand data requirements.
- Optimize data storage and retrieval processes.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Google BigQuery.
- Strong understanding of data engineering principles.
- Experience with cloud-based data services.
- Knowledge of SQL and database management systems.
- Hands-on experience with data modeling and schema design.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- A 15 years full time education is required.
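A routine data-quality step in the pipelines this role describes is deduplicating on a business key while keeping the latest record. The SQL pattern below is shown against sqlite3 for portability (in BigQuery the same result is typically achieved with `ROW_NUMBER()` over a partition); the table and column names are illustrative.

```python
# Keep only the newest row per customer_id by joining events against the
# per-key MAX(updated_at).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (customer_id TEXT, updated_at TEXT, status TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("c1", "2024-01-01", "new"),
    ("c1", "2024-02-01", "active"),   # newer row for c1 wins
    ("c2", "2024-01-15", "active"),
])

latest = conn.execute("""
    SELECT e.customer_id, e.status
    FROM events e
    JOIN (SELECT customer_id, MAX(updated_at) AS mx
          FROM events GROUP BY customer_id) m
      ON e.customer_id = m.customer_id AND e.updated_at = m.mx
    ORDER BY e.customer_id
""").fetchall()
print(latest)  # [('c1', 'active'), ('c2', 'active')]
```

ISO-8601 timestamps are used here so that string MAX() orders correctly; with native timestamp columns the same join works unchanged.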
Posted 1 month ago