
3301 Big Data Jobs - Page 6

Set up a Job Alert
JobPe aggregates listings so they are easy to find in one place; applications are submitted directly on the original job portal.

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: Databricks Unified Data Analytics Platform
Good to Have Skills: Scala, PySpark
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the needs of the organization and its clients.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.

Professional & Technical Skills:
- Must have: Proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: Experience with PySpark, Scala.
- Strong understanding of data engineering principles and practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have minimum 7.5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
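For context on the core skill set above, the sketch below shows the kind of PySpark transformation typically run on Databricks. It is illustrative only and not part of the posting; the paths, table, and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_summary").getOrCreate()

# Read raw orders (Parquet is a common lake format on Databricks); path is hypothetical
orders = spark.read.parquet("/mnt/raw/orders")

# Basic cleansing plus a daily revenue aggregate
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_revenue"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )
)

# Write the curated output, partitioned by date for efficient downstream reads
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/daily_revenue")
```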

Posted 5 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: Palantir Foundry
Good to Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Project Role: Lead Data Engineer
Project Role Description: Design, build and enhance applications to meet business processes and requirements in Palantir Foundry.
Work Experience: Minimum 6 years
Must Have Skills: Palantir Foundry, PySpark
Good to Have Skills: Experience in PySpark, Python and SQL; knowledge of Big Data tools and technologies; organizational and project management experience.

Job Requirements & Key Responsibilities:
- Design, develop, test, and support data pipelines and applications on Palantir Foundry.
- Configure and customize Workshop to design and implement workflows and ontologies.
- Collaborate with data engineers and stakeholders to ensure successful deployment and operation of Palantir Foundry applications.
- Work with stakeholders, including the product owner, data, and design teams, to assist with data-related technical issues, understand the requirements, and design the data pipeline.
- Work independently, troubleshoot issues and optimize performance.
- Communicate design processes, ideas, and solutions clearly and effectively to the team and client.
- Assist junior team members in improving efficiency and productivity.

Technical Experience:
- Proficiency in PySpark, Python and SQL with demonstrable ability to write and optimize SQL and Spark jobs.
- Hands-on experience with Palantir Foundry services such as Data Connection, Code Repository, Contour, Data Lineage and Health Checks.
- Good to have: working experience with Workshop, Ontology, Slate.
- Hands-on experience in data engineering and building data pipelines (code/no-code) for ELT/ETL data migration, data refinement and data quality checks on Palantir Foundry.
- Experience in ingesting data from different external source systems using data connections and syncs.
- Good knowledge of Spark architecture and hands-on experience in performance tuning and code optimization.
- Proficient in managing both structured and unstructured data, with expertise in handling various file formats such as CSV, JSON, Parquet, and ORC.
- Experience in developing and managing scalable architecture and managing large data sets.
- Good understanding of data loading mechanisms and ability to implement strategies for capturing CDC.
- Nice to have: test-driven development and CI/CD workflows.
- Experience with version control software such as Git and major hosting services (e.g. Azure DevOps, GitHub, Bitbucket, GitLab).
- Adherence to code best practices and guidelines that enhance code readability, maintainability, and overall quality.

Educational Qualification: 15 years of full-time education
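As an illustration of the PySpark and CDC skills listed above, here is a minimal watermark-based incremental load written in plain PySpark. It is a hedged sketch, not Foundry-specific code (Foundry's own transforms and Data Connection tooling are not shown), and all paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("incremental_customers").getOrCreate()

# Previously processed target and a fresh source extract (paths are hypothetical)
target = spark.read.parquet("/data/curated/customers")
source = spark.read.parquet("/data/raw/customers")

# Watermark-style change capture: keep only source rows updated after the
# latest timestamp already present in the target dataset
high_watermark = target.agg(F.max("updated_at").alias("wm")).collect()[0]["wm"]
changed = source.filter(F.col("updated_at") > F.lit(high_watermark))

# Append only the changed rows; dedup/merge logic would be layered on in a real pipeline
changed.write.mode("append").parquet("/data/curated/customers")
```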

Posted 5 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Snowflake Data Warehouse
Good to Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must have: Proficiency in Snowflake Data Warehouse.
- Good to have: Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms and data storage solutions.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have minimum 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.
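To illustrate the Snowflake-centric ETL work described above, the sketch below runs a single MERGE (upsert) through the snowflake-connector-python package. It is illustrative only; the account, credentials, and table names are hypothetical placeholders.

```python
import snowflake.connector

# Connection parameters are hypothetical; in practice they come from a secrets manager
conn = snowflake.connector.connect(
    user="ETL_USER",
    password="***",
    account="myorg-myaccount",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

# Upsert a staged delta into the core table; table and column names are hypothetical
merge_sql = """
MERGE INTO analytics.core.customers AS tgt
USING analytics.staging.customers_delta AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET tgt.email = src.email, tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
  VALUES (src.customer_id, src.email, src.updated_at)
"""

try:
    cur = conn.cursor()
    cur.execute(merge_sql)  # Snowflake runs the MERGE as a single atomic statement
finally:
    conn.close()
```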

Posted 5 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Oracle Procedural Language Extensions to SQL (PLSQL)
Good to Have Skills: Google Cloud Platform Architecture
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and usable for stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Develop and maintain robust data pipelines to support data processing and analytics.
- Collaborate with data architects and analysts to design data models that meet business requirements.

Professional & Technical Skills:
- Must have: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL).
- Good to have: Experience with Google BigQuery, Google Cloud Platform Architecture.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data quality assurance and data governance practices.
- Familiarity with data warehousing concepts and technologies.

Additional Information:
- The candidate should have minimum 3 years of experience in Oracle Procedural Language Extensions to SQL (PLSQL).
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
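The good-to-have Google Cloud skills above typically involve querying BigQuery from Python. The following is a minimal, illustrative sketch using the official google-cloud-bigquery client; the project, dataset, and column names are hypothetical.

```python
import datetime
from google.cloud import bigquery

# Project, dataset, table, and column names below are hypothetical
client = bigquery.Client(project="my-analytics-project")

sql = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `my-analytics-project.sales.orders`
    WHERE order_date >= @start_date
    GROUP BY order_date
    ORDER BY order_date
"""

# Parameterized queries avoid string concatenation and accidental SQL injection
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", datetime.date(2024, 1, 1))
    ]
)

# Run the query and iterate over the result rows
for row in client.query(sql, job_config=job_config).result():
    print(row.order_date, row.total_amount)
```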

Posted 5 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Oracle Procedural Language Extensions to SQL (PLSQL)
Good to Have Skills: Google BigQuery, Google Cloud Platform Architecture
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will engage in the design, development, and maintenance of data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are robust, scalable, and aligned with business objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Develop and optimize data pipelines to ensure efficient data flow and processing.
- Monitor and troubleshoot data quality issues, implementing corrective actions as necessary.

Professional & Technical Skills:
- Must have: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL).
- Good to have: Experience with Google BigQuery, Google Cloud Platform Architecture.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data modeling and database design principles.
- Familiarity with data warehousing concepts and best practices.

Additional Information:
- The candidate should have minimum 3 years of experience in Oracle Procedural Language Extensions to SQL (PLSQL).
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 5 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Oracle Procedural Language Extensions to SQL (PLSQL)
Good to Have Skills: Google BigQuery, Google Cloud Platform Architecture
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Develop and maintain robust data pipelines to support data processing and analytics.
- Collaborate with data architects and analysts to design data models that meet business requirements.

Professional & Technical Skills:
- Must have: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL).
- Good to have: Experience with Google BigQuery, Google Cloud Platform Architecture.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data quality assurance and data governance practices.
- Familiarity with data warehousing concepts and technologies.

Additional Information:
- The candidate should have minimum 3 years of experience in Oracle Procedural Language Extensions to SQL (PLSQL).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 5 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Coimbatore

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: PySpark
Good to Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in fostering a collaborative environment that encourages innovation and efficiency in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must have: Proficiency in PySpark.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration and ETL processes.
- Familiarity with cloud platforms and services related to data processing.
- Ability to troubleshoot and optimize performance of applications.

Additional Information:
- The candidate should have minimum 5 years of experience in PySpark.
- This position is based in Coimbatore.
- The candidate should be ready to work in rotational shifts.
- 15 years of full-time education is required.
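As a rough illustration of the PySpark performance-tuning ability this role calls for, the sketch below broadcasts a small dimension table to avoid a shuffle and inspects the physical plan. It is not part of the posting; paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("perf_tuning_sketch").getOrCreate()

transactions = spark.read.parquet("/data/transactions")  # large fact table (hypothetical)
merchants = spark.read.parquet("/data/merchants")         # small dimension table (hypothetical)

# Broadcasting the small dimension avoids shuffling the large fact table
enriched = transactions.join(broadcast(merchants), "merchant_id")

# Inspect the physical plan to confirm a broadcast hash join was chosen
enriched.explain()

# Repartition by the write key so output files align with downstream query patterns
enriched.repartition("merchant_id").write.mode("overwrite").parquet("/data/enriched_transactions")
```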

Posted 5 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Data Engineering
Good to Have Skills: Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, Google Cloud Platform Architecture
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. A typical day involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and optimize data workflows, ensuring that the data infrastructure supports the organization's analytical needs effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must have: Proficiency in Data Engineering.
- Good to have: Experience with Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, Google Cloud Platform Architecture.
- Strong understanding of data modeling and database design principles.
- Experience with data warehousing solutions and data lake architectures.
- Familiarity with data integration tools and ETL frameworks.

Additional Information:
- The candidate should have minimum 5 years of experience in Data Engineering.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 5 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: SAS Base & Macros
Good to Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must have: Proficiency in SAS Base & Macros.
- Good to have: Experience with data visualization tools.
- Strong understanding of data warehousing concepts and practices.
- Experience in developing and maintaining ETL processes.
- Familiarity with data quality frameworks and best practices.

Additional Information:
- The candidate should have minimum 5 years of experience in SAS Base & Macros.
- This position is based in Chennai.
- 15 years of full-time education is required.

Posted 5 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Data Analysis & Interpretation
Good to Have Skills: NA
Minimum Experience: 2 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must have: Proficiency in Data Analysis & Interpretation.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data visualization tools to present findings effectively.
- Knowledge of programming languages such as Python or SQL for data manipulation.

Additional Information:
- The candidate should have minimum 2 years of experience in Data Analysis & Interpretation.
- This position is based at our Pune office.
- 15 years of full-time education is required.
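For a sense of the Python-based data analysis mentioned above, here is a small, illustrative pandas sketch covering basic quality checks and a monthly rollup. The file and column names are hypothetical and not part of the posting.

```python
import pandas as pd

# Load a hypothetical extract and run first-pass profiling/quality checks
df = pd.read_csv("customer_orders.csv", parse_dates=["order_date"])

# Null counts per column and a duplicate-key check are typical validations
print(df.isna().sum())
duplicates = df[df.duplicated(subset=["order_id"], keep=False)]
print(f"Duplicate order_id rows: {len(duplicates)}")

# A simple interpretation step: monthly order volume and average order value
monthly = (
    df.assign(order_month=df["order_date"].dt.to_period("M"))
      .groupby("order_month")
      .agg(order_count=("order_id", "count"), avg_value=("amount", "mean"))
)
print(monthly)
```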

Posted 5 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Noida

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Google BigQuery
Good to Have Skills: Microsoft SQL Server, Google Cloud Data Services
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Develop and maintain data pipelines.
- Ensure data quality throughout the data lifecycle.
- Implement ETL processes for data migration and deployment.
- Collaborate with cross-functional teams to understand data requirements.
- Optimize data storage and retrieval processes.

Professional & Technical Skills:
- Must have: Proficiency in Google BigQuery.
- Strong understanding of data engineering principles.
- Experience with cloud-based data services.
- Knowledge of SQL and database management systems.
- Hands-on experience with data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
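As an illustration of routine BigQuery ingestion work of the kind described above, the sketch below loads Parquet files from Cloud Storage into a table with the google-cloud-bigquery client. It is illustrative only; the bucket, project, and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

# Target table and source URI are hypothetical
table_id = "my-analytics-project.staging.events"
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-landing-bucket/events/dt=2024-01-01/*.parquet",
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

print(client.get_table(table_id).num_rows, "rows now in", table_id)
```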

Posted 5 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: Data Engineering
Good to Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving innovation and efficiency within the application development lifecycle, fostering a collaborative environment that encourages team growth and success.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must have: Proficiency in Data Engineering.
- Strong understanding of data pipeline architecture and ETL processes.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with data warehousing solutions and big data technologies.
- Ability to work with various data storage solutions, including SQL and NoSQL databases.

Additional Information:
- The candidate should have minimum 5 years of experience in Data Engineering.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 5 days ago

Apply

3.0 - 8.0 years

13 - 18 Lacs

Pune

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must Have Skills: MongoDB
Good to Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during the development process. Your role will be pivotal in ensuring that the data architecture is robust, scalable, and efficient, contributing to the overall success of the application and the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze data requirements.
- Design and implement data models that support business processes and analytics.

Professional & Technical Skills:
- Must have: Proficiency in MongoDB.
- Strong understanding of data modeling concepts and best practices.
- Experience with data integration tools and techniques.
- Familiarity with cloud-based data storage solutions.
- Knowledge of data governance and data quality principles.

Additional Information:
- The candidate should have minimum 3 years of experience in MongoDB.
- This position is based in Pune.
- 15 years of full-time education is required.
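To illustrate the MongoDB data-modeling skills above, here is a brief pymongo sketch: an index supporting the main query pattern plus an aggregation rollup. It is illustrative only; the connection string, collection, and field names are hypothetical.

```python
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")  # connection string is hypothetical
db = client["commerce"]
orders = db["orders"]

# Indexes are part of the data model: support the dominant query pattern up front
orders.create_index([("customer_id", ASCENDING), ("order_date", ASCENDING)])

# A typical read-side design artifact: roll completed orders up per customer
pipeline = [
    {"$match": {"status": "COMPLETED"}},
    {"$group": {"_id": "$customer_id",
                "order_count": {"$sum": 1},
                "total_spend": {"$sum": "$amount"}}},
    {"$sort": {"total_spend": -1}},
    {"$limit": 10},
]
for doc in orders.aggregate(pipeline):
    print(doc)
```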

Posted 5 days ago

Apply

7.0 - 11.0 years

13 - 18 Lacs

Pune

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must Have Skills: Apache Kafka
Good to Have Skills: Data Analytics
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly across systems, contributing to the overall efficiency and effectiveness of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.

Professional & Technical Skills:
- Must have: Proficiency in Apache Kafka.
- Good to have: Experience with Data Analytics.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud-based data storage solutions.

Additional Information:
- The candidate should have minimum 5 years of experience in Apache Kafka.
- This position is based at our Pune office.
- 15 years of full-time education is required.
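As a minimal illustration of the Kafka-centric integration work above, the sketch below publishes keyed JSON events with the kafka-python client. It is illustrative only; the broker address, topic, and payload are hypothetical.

```python
import json
from kafka import KafkaProducer

# Broker address, topic name and event payload are hypothetical
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    key_serializer=lambda k: k.encode("utf-8"),
)

event = {"customer_id": "C123", "event_type": "ADDRESS_UPDATED", "ts": "2024-01-01T10:00:00Z"}

# Keying by customer_id keeps all events for one customer ordered on a single partition
producer.send("customer-events", key=event["customer_id"], value=event)
producer.flush()
```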

Posted 5 days ago

Apply

5.0 - 10.0 years

0 - 2 Lacs

Hyderabad

Hybrid

Role & Responsibilities:
- 5+ years of extensive experience working with multiple databases, ETL and BI testing.
- Working experience in the Investment Management and Capital Markets domain is preferred.
- Experience working with applications such as Eagle, Calypso, or Murex is an added advantage.
- Experience in delivering large releases to the customer through direct and partner teams.
- Experience in testing data validation scenarios and data ingestion, pipelines, and transformation processes.
- Experience in Vertica, DataStage, Teradata, and Big Data environments for both data ingestion and consumption.
- Extensive knowledge of at least one Business Intelligence tool, preferably MicroStrategy or Tableau.
- Extensive experience in writing and troubleshooting complex SQL queries.
- Expert in providing QA solutions based on data warehousing and dimensional modelling design.
- Expert in drafting ETL source-to-target mapping document design.
- Identify data validation tools that suit the ETL project conditions.
- Ensure all sign-offs on deliverables (overall test strategy, test plan, test cases and test results) and that testing meets governance requirements.
- Establish and drive automation capabilities; collaborate with dev and architect teams to identify and prioritize opportunities for automation.
- Experience in ETL automation with open-source tools, service virtualization, and CI/CD.
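For the ETL testing skills described above, a typical source-to-target reconciliation compares row counts and key aggregates. The sketch below is illustrative only and runs on exported extracts; in practice the same checks are usually issued as SQL against the source and target databases. File and column names are hypothetical.

```python
import pandas as pd

# Compare a source extract against the loaded target table export (names are hypothetical)
source = pd.read_csv("source_extract.csv")
target = pd.read_csv("target_table_extract.csv")

checks = {
    "row_count": (len(source), len(target)),
    "sum_amount": (round(source["amount"].sum(), 2), round(target["amount"].sum(), 2)),
    "distinct_accounts": (source["account_id"].nunique(), target["account_id"].nunique()),
}

# Any mismatch between the source-side and target-side value fails the validation
failures = {name: vals for name, vals in checks.items() if vals[0] != vals[1]}
if failures:
    raise AssertionError(f"ETL validation failed: {failures}")
print("All source-to-target checks passed:", checks)
```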

Posted 5 days ago

Apply

12.0 - 17.0 years

12 - 17 Lacs

Pune

Work from Office

Role Overview: The Technical Architect specializing in Data Governance and Master Data Management (MDM) designs, implements, and optimizes enterprise data solutions. The jobholder has expertise in tools like Collibra, Informatica, InfoSphere, Reltio, and other MDM platforms, ensuring data quality, compliance, and governance across the organization.

Responsibilities:
- Architect and optimize strategies for data quality, metadata management, and data stewardship.
- Design and implement data governance frameworks and MDM solutions using tools like Collibra, Informatica, InfoSphere, and Reltio.
- Develop strategies for data quality, metadata management, and data stewardship.
- Collaborate with cross-functional teams to integrate MDM solutions with existing systems.
- Establish best practices for data governance, security, and compliance.
- Monitor and troubleshoot MDM environments for performance and reliability.
- Provide technical leadership and guidance to data teams.
- Stay updated on advancements in data governance and MDM technologies.

Key Technical Skills & Responsibilities:
- Overall 12+ years of experience, including 10+ years working on DG/MDM projects.
- Strong on Data Governance concepts.
- Hands-on with different DG tools/services.
- Hands-on with reference data and taxonomy.
- Strong understanding of Data Governance, Data Quality, Data Profiling, Data Standards, Regulations, and Security.
- Match-and-merge strategy.
- Design and implement the MDM architecture and data models.
- Usage of Spark capabilities.
- Statistics to deduce meaning from vast enterprise-level data.
- Different data visualization means of analyzing huge data sets.
- Good to have: knowledge of Python/R/Scala.
- Experience with DG on-premise and on-cloud.
- Understanding of MDM and of Customer, Product, and Vendor domains and related artifacts.
- Experience working on proposals, customer workshops, assessments, etc. is preferred.
- Must have good communication and presentation skills.
- Technology stack: Collibra, IBM MDM, Reltio, InfoSphere.

Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Management, or a related field.
- Proven experience as a Technical Architect in Data Governance and MDM.
- Certifications in relevant MDM tools (e.g., Collibra Data Governance, Informatica/InfoSphere/Reltio MDM).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Proficiency in tools like Collibra, Informatica, InfoSphere, Reltio, and similar platforms.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
- Excellent problem-solving and communication skills.

Posted 5 days ago

Apply

12.0 - 16.0 years

0 - 0 Lacs

Pune

Hybrid

So, what's the role all about?
At Actimize, we are developing the next-generation cloud platform that will host solutions in the space of Anti-Money Laundering and Fraud Prevention. As part of the R&D team, you will be responsible for architecting and developing a platform that is robust, fault-tolerant, available, scalable, secure and performant.

How will you make an impact?
NICE Actimize is the largest and broadest provider of financial crime, risk and compliance solutions for regional and global financial institutions, and has been consistently ranked number one in the space. At NICE Actimize, we recognize that every employee's contributions are integral to our company's growth and success. To find and acquire the best and brightest talent around the globe, we offer a challenging work environment, competitive compensation and benefits, and rewarding career opportunities. Come share, grow and learn with us: you'll be challenged, you'll have fun and you'll be part of a fast-growing, highly respected organization. This new SaaS platform will enable our customers (some of the biggest financial institutions around the world) to create solutions on the platform to fight financial crime.

Have you got what it takes?
The Software Architect is responsible for providing technical leadership across development teams in one or more functional areas. This position is ultimately responsible for the successful implementation of key deliverables to ensure that each release is designed with high availability, scalability, serviceability and supportability in mind. You will be responsible for the architecture of our product platform end to end, reporting to the Director of Engineering and working closely with all the engineering functions, the chief architect, the architecture review board and the scrum teams. You will drive an open and extendible architecture, ensuring that relevant technology solutions are adopted, good engineering practices are implemented and the overall system design and architecture is maintained while we develop new features for the market. You will have a key role in fostering innovation and ensuring adoption of new technologies as needed. We are looking for someone who is passionate about delivering high-quality enterprise cloud products that are used by millions of users. You are expected to be very hands-on and have an in-depth technical understanding of cloud software architecture.

Main Responsibilities and Deliverables:
- Set the end-to-end technical direction for the team, including platform, technology, tools, and infrastructure.
- Communicate architecture in an effective and professional manner.
- Drive technical decisions, solve tough problems, and coordinate multiple project executions.
- Ensure that development coding quality standards and review processes are followed to ensure proper security and high-availability standards.
- Monitor the quality of code delivered by your team through reviews and other processes.
- Foster a strong teamwork environment and create passion and energy within the team.
- Be an active participant in the development leadership team, ensuring corporate success.
- Represent self and department with professionalism and competence.
- Follow the company Code of Ethics and policies and procedures at all times.
- Take overall responsibility for the platform architecture, establishing a well-architected and well-designed solution.
- Work with several scrum teams and be actively involved in the design of multiple features in parallel.
- Mentor and develop architects and engineers, and be involved in hiring great engineers.
- Participate in envisioning next-generation plans to achieve the longer-term strategic objectives of the organization.
- Drive the architecture of a project/product line, including authoring functional and design specifications, scalability, security, data flow, and interfaces.
- Contribute to the strategic vision of the Guardian Analytics business unit.
- Identify the impact of new technologies on our products and communicate it to development teams.
- Partner with customers/prospects on product functionality and future direction; participate in industry briefings.
- Consider customer impact; when considering alternatives, evaluate whether something we are asking the customer to do would be reasonable.
- Maintain an attitude of quality, diligence and thoroughness.
- Conduct architectural and design reviews.
- Conduct code reviews.
- Evaluate new technologies/innovations and industry trends.
- Intentionally drive business value through architectural innovation.

Qualifications:
- Bachelor's or master's degree in computer science or a related field.
- 10-12 years of software development experience.
- About 4 years of experience as an architect for cloud SaaS multi-tenant applications.
- Development experience with APIs, integrations, middleware technologies, web development technologies and the Spring family of frameworks.
- Working knowledge of design patterns, methodologies and architectural styles.
- Experience in database design, data APIs, and batch/Big Data processing.
- Experience designing multi-tiered service-oriented applications, microservices and REST.
- Experience with AI/ML development or usage will be a big plus.
- Experience with NoSQL DBs will be an advantage.
- Development experience in securing all tiers of the application.
- Exposure to performance engineering, cloud SLAs, availability and resiliency will be a plus.
- Technically savvy; an influencer and a learner.
- Experience using Test Driven Development, Continuous Integration, and Test Automation.
- Experience driving improved developer experience through technology and tools.

What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX!
At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 8091
Reporting into: Director
Role Type: Tech Manager

Posted 5 days ago

Apply

8.0 - 10.0 years

30 - 32 Lacs

Hyderabad

Work from Office

Candidate Specifications:
- 9+ years of overall experience.
- 9+ years of experience in Python and PySpark.
- Strong experience in AWS and PL/SQL.
- Strong in data management, including data governance and data streaming, along with data lakes and data warehouses.
- Exposure to team handling and stakeholder management.
- Excellent written and verbal communication skills.

Contact Person: Sheena Rakesh

Posted 5 days ago

Apply

18.0 - 22.0 years

0 Lacs

chennai, tamil nadu

On-site

As the General Manager/Director of Engineering for India Operations at our organization, your primary responsibility will be overseeing the product engineering, quality assurance, and product support functions. You will play a crucial role in ensuring the delivery of high-quality software products that meet SLAs, are delivered on time, and are within budget. Collaborating with the CTO and other team members, you will help develop a long-term product plan for client products and manage the release planning cycles for all products.

A key aspect of your role will involve resource management and ensuring that each product team has the skilled resources needed to meet deliverables. You will also be responsible for developing and managing a skills escalation and promotion path for the product engineering organization, as well as implementing tools and processes to optimize product engineering throughput and quality.

Key Result Areas (KRAs) for this role include working effectively across multiple levels of the organization and in a global setting, ensuring key milestones are met, delivering high-quality solutions, meeting project timelines and SLAs, maintaining customer satisfaction, ensuring controlled releases to production, and aligning personnel with tasks effectively. Additionally, you will need a deep understanding of our products, their interrelationships, and their relevance to the business to ensure their availability and stability.

To qualify for this role, you should have a Bachelor's degree in Computer Science/Engineering from a premier institute, with an MBA preferred. You should have at least 18 years of software development experience, including 10+ years in a managerial capacity. Strong knowledge of the software development process, hands-on implementation experience, leadership experience in an early-stage start-up, familiarity with mobile technologies, and professional experience with interactive languages and technologies such as FLEX, PHP, HTML5, MYSQL, and MONGODB are desired. Experience with Agile methodology and on-site experience working in the US would be advantageous.

In summary, as the General Manager/Director of Engineering for India Operations, you will be instrumental in driving the success of our product engineering efforts, ensuring high-quality deliverables, and optimizing processes to meet business objectives effectively. If you are interested in this exciting opportunity, please reach out to us at jobs@augustainfotech.com.

Posted 6 days ago

Apply

2.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Intermediate Programmer Analyst role at our organization involves actively participating in the establishment and implementation of new or revised application systems and programs in collaboration with the Technology team. Your primary objective will be to contribute to applications systems analysis and programming activities.

You will be expected to use your knowledge of applications development procedures and concepts, along with a basic understanding of other technical areas, to identify and define necessary system enhancements. This includes leveraging script tools and analyzing and interpreting code. Additionally, you will consult with users, clients, and other technology groups on issues, recommend programming solutions, and provide installation and support for customer exposure systems. As an Intermediate Programmer Analyst, you will apply your fundamental knowledge of programming languages to create design specifications and analyze applications to detect vulnerabilities and security issues. Testing and debugging will also be part of your responsibilities. Furthermore, you will serve as an advisor or coach to new or lower-level analysts, identify problems, analyze information, and make evaluative judgments to recommend and implement solutions.

In this role, you will resolve issues by selecting solutions based on your technical experience, guided by precedent. You should be able to operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and act as a subject matter expert to senior stakeholders and/or other team members. It is essential to appropriately assess risk when making business decisions, with a focus on safeguarding the firm's reputation and ensuring compliance with applicable laws and regulations. This includes adhering to policies, demonstrating ethical judgment in personal behavior and business practices, and reporting control issues transparently.

Qualifications:
- 4-8 years of relevant experience in Data Analytics or Big Data.
- Hands-on experience with SQL, Python, and PySpark, including Spark components.
- 2-4 years of experience as a Big Data Engineer developing, optimizing, and managing large-scale data processing systems and analytics platforms.
- 4 years of experience in distributed data processing and near real-time data analytics using PySpark.
- ETL experience is preferred over Ab Initio.
- Strong understanding of PySpark execution plans, partitioning, and optimization techniques.

Education:
- Bachelor's degree or equivalent experience.

This is a full-time position within the Technology job family group, specifically in the Applications Development job family. If you possess the necessary skills and experience, we encourage you to apply and become part of our dynamic team.
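As a small illustration of the PySpark execution-plan and partitioning knowledge listed above, the sketch below inspects partition counts, tunes the shuffle-partition setting, and prints the query plans. It is illustrative only; paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("plan_inspection_sketch").getOrCreate()

events = spark.read.parquet("/data/events")  # hypothetical input
print("input partitions:", events.rdd.getNumPartitions())

# Shuffle-heavy aggregation: the number of post-shuffle partitions is governed by this setting
spark.conf.set("spark.sql.shuffle.partitions", "64")

summary = events.groupBy("event_type").agg(F.count(F.lit(1)).alias("cnt"))

# explain(True) prints the parsed, analyzed, optimized, and physical plans
summary.explain(True)
print("output partitions:", summary.rdd.getNumPartitions())
```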

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

You will be joining Atgeir Solutions, a leading innovator in technology renowned for its commitment to excellence. As a Technical Lead specializing in Big Data and Cloud technologies, you will have the opportunity for advancement to the role of Technical Architect.

Your responsibilities will include leveraging your expertise in Big Data and Cloud technologies to contribute to the design, development, and implementation of complex systems. You will lead and inspire a team of professionals, offering technical guidance and mentorship to foster a collaborative and innovative work environment. In addition, you will be tasked with solving intricate technical challenges and guiding your team in overcoming obstacles in Big Data and Cloud environments. Investing in the growth and development of your team members will be crucial, including identifying training needs, organizing knowledge-sharing sessions, and promoting a culture of continuous learning. Collaboration with stakeholders, such as clients, architects, and other leads, will be essential to understand requirements and align technology strategies with business goals, particularly in the realm of Big Data and Cloud.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with 7-10 years of experience in software development. A proven track record of technical leadership in Big Data and Cloud environments is required. Proficiency in technologies like Hadoop, Spark, GCP, AWS, and Azure is essential, with knowledge of Databricks/Snowflake considered an advantage. Strong communication and interpersonal skills are necessary to convey technical concepts to various stakeholders effectively.

Upon a successful tenure as a Technical Lead, you will have the opportunity to progress into the role of Technical Architect. This advancement will entail additional responsibilities related to system architecture, design, and strategic technical decision-making, with a continued focus on Big Data and Cloud technologies.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As an Associate Data Architect at Quantiphi, you will be part of a dynamic team that thrives on innovation and growth. Your role will involve designing and delivering big data pipelines for structured and unstructured data across diverse geographies, particularly focusing on assisting healthcare organizations in achieving their business objectives through the use of data ingestion technologies, cloud services, and DevOps practices.

Your responsibilities will include collaborating with cloud engineers and clients to address large-scale data challenges by creating tools for migration, storage, and processing on Google Cloud. You will be instrumental in crafting cloud migration strategies for both cloud-based and on-premise applications, as well as diagnosing and resolving complex issues within distributed systems to enhance efficiency at scale. In this role, you will have the opportunity to design and implement cutting-edge solutions for data storage and computation for various clients. You will work closely with experts from different domains such as cloud engineering, software engineering, and ML engineering to develop platforms and applications that align with the evolving trends in the healthcare sector, including digital diagnosis, AI marketplaces, and software as a medical product.

Effective communication with cross-functional teams, including Infrastructure, Network, Engineering, DevOps, SiteOps, and cloud customers, will be essential to drive successful project outcomes. Additionally, you will play a key role in building advanced automation tools, monitoring solutions, and data operations frameworks across multiple cloud environments to streamline processes and enhance operational efficiency. A strong understanding of data modeling and governance principles will be crucial, enabling you to contribute meaningfully to the development of scalable and sustainable data architectures.

If you thrive in a fast-paced environment that values innovation, collaboration, and continuous learning, then a career as an Associate Data Architect at Quantiphi is the perfect fit for you. Join us and be part of a team of dedicated professionals who are passionate about driving positive change through technology and teamwork.

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

You are an experienced professional with over 7 years in application or production support, looking to join our Production/Application Support team. You possess a blend of strong technical skills in Unix, SQL, and Big Data technologies along with domain expertise in financial services such as securities, secured financing, rates, liquidity reporting, derivatives, front office/back office systems, and the trading lifecycle.

Your key responsibilities will include providing L2 production support for mission-critical liquidity reporting and financial applications, ensuring high availability and performance. You will monitor and resolve incidents related to trade capture, batch failures, position keeping, market data, pricing, risk, and liquidity reporting. Additionally, you will proactively manage alerts, logs, and jobs using Autosys, Unix tools, and monitoring platforms like ITRS/AWP.

In this role, you will execute advanced SQL queries and scripts for data analysis, validation, and issue resolution. You will also support multiple applications built on stored procedures, SSIS, SSRS, and Big Data ecosystems (Hive, Spark, Hadoop), and troubleshoot data pipeline issues. It will be your responsibility to maintain and improve knowledge bases, SOPs, and runbooks for production support while actively participating in change management and release activities, including deployment validations. You will lead root cause analysis (RCA), conduct post-incident reviews, and drive permanent resolutions. Collaboration with infrastructure teams on capacity, performance, and system resilience initiatives will be crucial. Your contribution to continuous service improvement, stability management, and automation initiatives will be highly valued.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field. A minimum of 7 years of experience in application or production support, with at least 2 years at an advanced level, is required. Hands-on experience with Unix/Linux scripting, file manipulation, job control, SQL (MSSQL/Oracle or similar), stored procedures, SSIS, SSRS, Big Data technologies (Hadoop, Hive, Spark), job schedulers like Autosys, and log analysis tools is essential. Solid understanding of financial instruments and the trade lifecycle, knowledge of front office/back office and reporting workflows and operations, excellent analytical and problem-solving skills, effective communication and stakeholder management skills, and experience with ITIL processes are also key requirements for this role. If you meet these qualifications and are looking to join a dynamic team, we encourage you to apply.

Posted 6 days ago

Apply

2.0 - 4.0 years

25 - 30 Lacs

Pune

Work from Office

Rapid7 is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey. Key responsibilities:
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 1 week ago

Apply

1.0 - 4.0 years

25 - 30 Lacs

Thane

Work from Office

EsyCommerce is seeking a highly experienced Data Engineer to join our growing team in either Mumbai or Pune. This role requires a strong foundation in data engineering principles, coupled with experience in application development and data science techniques. The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines and applications, as well as leveraging analytical skills to transform data into valuable insights. This position calls for a blend of technical expertise, problem-solving abilities, and effective communication skills to drive data-driven solutions that meet business objectives.

Educational Qualification: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.

Posted 1 week ago

Apply