6 - 10 years
7 - 11 Lacs
Mumbai, Hyderabad, Bengaluru
Work from Office
Job Description
Join our innovative team as a Python developer with Java expertise. Develop scalable data-intensive applications, APIs, and scripts, ensuring optimized performance. Collaborate with cross-functional teams, participate in code reviews, and contribute to best practices. You will work on complex data-driven projects, leveraging your expertise to design, develop, and deploy high-quality solutions. Our team values open communication, collaborative problem-solving, and continuous learning. Career Level: IC3

Responsibilities:
- Design, develop, and deploy Python applications and scripts for data processing and analysis
- Create RESTful APIs using Java for data integration
- Develop and optimise data retrieval and querying mechanisms
- Troubleshoot and optimise data query performance and retrieval
- Participate in code reviews and contribute to best practices
- Debug issues and provide effective solutions
- Collaborate with cross-functional teams to identify and prioritise project requirements

Qualifications:
- Bachelor of Technology (B.Tech) or equivalent in Computer Science/Information Technology
- 6+ years of overall experience in software development
- 3+ years of experience in Python development
- 3+ years of experience in Java development
- 2+ years of experience with MySQL
- Strong oral and written communication skills in English
- Excellent problem-solving skills

Good to Have:
- Experience with Artificial Intelligence (AI) and Machine Learning (ML) concepts
- Familiarity with AI/ML frameworks (TensorFlow, PyTorch, Scikit-Learn)
- Knowledge of Generative AI / Agents / Chatbots
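As an illustration of the kind of work this posting describes (Python scripts over MySQL data exposed through a REST layer), here is a minimal sketch. It is not taken from the employer's codebase; the database, table, and endpoint names (sales, orders, /api/daily-totals) are hypothetical, and it assumes the mysql-connector-python and Flask packages are available.

```python
# Minimal sketch: query MySQL and expose aggregated results over a REST endpoint.
# Database, table, and column names (sales, orders, amount) are hypothetical placeholders.
import mysql.connector
from flask import Flask, jsonify

app = Flask(__name__)

def fetch_daily_totals():
    conn = mysql.connector.connect(
        host="localhost", user="app_user", password="secret", database="sales"
    )
    try:
        cur = conn.cursor(dictionary=True)
        cur.execute(
            "SELECT DATE(created_at) AS day, SUM(amount) AS total "
            "FROM orders GROUP BY DATE(created_at) ORDER BY day"
        )
        return cur.fetchall()
    finally:
        conn.close()

@app.route("/api/daily-totals")
def daily_totals():
    # Serialise DATE and DECIMAL values into JSON-friendly types.
    rows = [{"day": r["day"].isoformat(), "total": float(r["total"])}
            for r in fetch_daily_totals()]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=8080)
```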
Posted 1 month ago
6 - 10 years
13 - 17 Lacs
Bengaluru
Work from Office
Job Description: Data Integration, OCI

Oracle Cloud is a comprehensive enterprise-grade platform that offers best-in-class services across Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). The Oracle Cloud platform offers choice and flexibility for customers to build, deploy, integrate, and extend applications in the cloud, enabling them to adapt to rapidly changing business requirements, promote interoperability, and avoid lock-in. The platform supports numerous open standards (SQL, HTML5, REST, and more), open-source solutions (such as Kubernetes, Hadoop, Spark, and Kafka), and a wide variety of programming languages, databases, tools, and integration frameworks.

Our Team
Oracle Cloud Infrastructure (OCI) is a strategic growth area for Oracle. It is a comprehensive cloud services offering in the enterprise software industry, spanning Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). OCI is currently building a future-ready Gen2 cloud data management platform. Oracle Cloud Data Integration underpins a comprehensive, best-in-class data integration PaaS offering with hundreds of out-of-the-box connectors to seamlessly integrate on-prem and cloud applications.

Your Opportunity
We are on a path-breaking journey to build a best-of-breed Data Integration service built for hyperscale by leveraging cutting-edge technologies (Spark, Scala, Livy, Apache Flink, Airflow, Kubernetes, etc.) and modern design/architecture principles (microservices, scale to zero, telemetry, circuit breakers, etc.) as part of the next-gen AI-fueled cloud computing platform. You will have the opportunity to be part of a team of passionate engineers who are fueled by serving customers and have a penchant for constantly pushing the innovation bar. We are looking for a strong engineer who thrives on research and development projects. We want you to be a strong technical leader who is hands-on, works with the development team, and can work efficiently with other product groups that may be remote in different geographies. You should be comfortable working with product management, and with senior architects and engineering leaders, to make sure we are building the right product and services using the right design principles.

Your Qualifications
- B.E./M.E./PhD (Computer Science, Electronics, or Electrical Engineering)
- 5+ years of experience, with at least 2 years in cloud technologies
- Strong technical understanding of building scalable, high-performance distributed services/systems
- Strong experience with Java, open-source technologies, and API standards
- Experience working on cloud infrastructure APIs, the REST API model, and developing REST APIs
- Strong knowledge of Docker/Kubernetes
- Deep understanding of data structures and algorithms, and excellent problem-solving skills
- Working experience in one or more of the following domains (experience in any one of these is a plus):
  - Familiarity with the Data Integration domain
  - Data ingestion frameworks
  - Orchestrating complex computational workflows
  - Messaging brokers like Kafka
- Strong problem-solving, troubleshooting, and analytical skills
- Familiarity with Agile processes will be an added advantage
- Excellent communication, presentation, interpersonal, and analytical skills, including the ability to communicate complex concepts clearly to different audiences
- Ability to quickly learn new technologies in a dynamic environment
Design, develop, troubleshoot, and debug software programs for databases, applications, tools, networks, etc. As a member of the software engineering division, you will assist in defining and developing software for tasks associated with developing, debugging, or designing software applications or operating systems. Provide technical leadership to other software developers. Specify, design, and implement modest changes to existing software architecture to meet changing needs. Duties and tasks are varied and complex, requiring independent judgment. Fully competent in own area of expertise. May have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to the functional area. 4 years of software engineering or related experience. Career Level: IC3

Responsibilities
Your Responsibilities: the job requires you to interface with other internal product development teams as well as cross-functional teams (Product Management, Integration Engineering, Quality Engineering, UX, and Technical Writers). At a high level, the work will involve developing features on OCI, which includes but may not be limited to the following:
- Help drive the next-generation Data Integration cloud services using Oracle standard tools, technology, and development practices
- Work directly with product management
- Work directly with architects to ensure newer capabilities are built applying the right design principles
- Work with remote and geographically distributed teams to enable building the right products, using the right building blocks, and making them easily consumable by other products
- Be very technically hands-on and own/drive key end-to-end services
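To make the stack named above concrete, here is a minimal, hypothetical sketch of a PySpark Structured Streaming job reading JSON events from Kafka, the kind of ingestion step a data integration service involves. The broker address, topic, schema, and paths are placeholders, running it requires the Spark Kafka connector package, and it is not Oracle's implementation.

```python
# Minimal sketch: a PySpark Structured Streaming job that ingests JSON events
# from Kafka and writes them to Parquet. Broker, topic, schema, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("source", StringType()),
    StructField("value", DoubleType()),
])

# Read the raw Kafka stream; the message payload arrives as bytes in the `value` column.
raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "integration-events")
       .option("startingOffsets", "latest")
       .load())

# Parse the JSON payload into typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Land the parsed events as Parquet files with checkpointing for fault tolerance.
query = (events.writeStream
         .format("parquet")
         .option("path", "/data/landing/integration-events")
         .option("checkpointLocation", "/data/checkpoints/integration-events")
         .start())
query.awaitTermination()
```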
Posted 1 month ago
5 - 10 years
13 - 18 Lacs
Mumbai
Work from Office
About The Role
Role: Product Manager
Duration: Full Time
Location: Edison, NJ / NYC, NY
Mode: Hybrid (3 days WFO & 2 days WFH)
Private Equity / Fund Administration / Hedge Fund experience is required.

A New York City based Private Equity fund administration firm is looking for a product manager to assist in the implementation of client-facing technology products. This is a high-visibility role, with access to senior management and founding members.

Primary Responsibilities Will Include:
- Work closely with SMEs and management to understand product scope
- Document current & future state (e.g., functionality, data flow, reports, UX, etc.)
- Solicit and document requirements from business users
- Design and document future enhancements to the client's Private Exchange platform
- Own the platform transaction flow process, design, and development roadmap
- Assist the client onboarding & implementation teams with technology adoption
- Define automated and manual testing processes to ensure quality of data
- Liaise with technologists to further understand and document processes
- Define Success (aka Acceptance) Criteria to ensure all requirements are fulfilled
- Build and execute comprehensive software test plans
- User guide documentation
- Manage technology deployments and conversions
- Support technology solutions
- Manage sub-projects

Job Requirements, Skills, Education and Experience:
- Bachelor's degree required; a degree in Accounting, Finance, or Economics is a plus
- 5+ years of business analysis or business systems analysis experience
- 2+ years of financial services experience; Private Equity experience a plus
- Extensive experience with Private Equity systems such as Allvue, Investran or eFront
- Performance, Analytics or Business Intelligence experience a plus
- Familiarity with Agile SDLC and/or Project Management methods a plus
- Experience with data integration and mapping (e.g., ETL, ELT, etc.) a plus
- Familiarity with development languages a plus (e.g., VBA, SQL, Python, Java, C++, etc.)
- Extensive Microsoft Office skills: Excel, Word, Visio, PowerPoint
- Strong verbal and written communication skills
- Strong attention to detail and accuracy
- Ability to learn quickly on the job and apply that learning to recommend solutions to issues
- Ability to quickly adapt to changes in prioritization, processes, and procedures
- Superior problem-solving, judgement and decision-making skills
- Ability to think independently, prioritize, multi-task and meet deadlines
Posted 1 month ago
6 - 11 years
12 - 17 Lacs
Hyderabad
Work from Office
About The Role
BI Project Manager - Remote/Hybrid - Hyderabad
Responsible for overseeing and managing the implementation of business intelligence projects within an organization. Work closely with stakeholders, including business leaders, IT teams, and data analysts, to define project objectives, scope, and deliverables.

Key Responsibilities:
1. Project Planning: Develop and maintain project plans, including timelines, resource allocation, and budgeting. Identify project risks and develop mitigation strategies.
2. Requirement Gathering: Collaborate with business stakeholders to understand their data and reporting needs. Translate business requirements into technical specifications for the BI team.
3. Team Management: Lead a team of data analysts, developers, and other project resources. Assign tasks, monitor progress, and ensure timely delivery of project milestones.
4. Data Analysis: Analyze and interpret complex data sets to identify trends, patterns, and insights. Use data visualization tools to present findings to stakeholders.
5. Technical Expertise: Possess a strong understanding of BI tools, data warehousing concepts, and data integration techniques. Stay updated with the latest trends and advancements in the BI field.
6. Stakeholder Communication: Facilitate effective communication between business stakeholders, IT teams, and project resources. Provide regular project updates, address concerns, and manage expectations.
7. Quality Assurance: Ensure the accuracy, completeness, and reliability of data and reports generated by the BI system. Conduct thorough testing and validation to identify and resolve any issues.
8. Change Management: Implement change management strategies to ensure smooth adoption of new BI solutions. Provide training and support to end-users to maximize the utilization of BI tools.
9. Project Documentation: Maintain comprehensive project documentation, including project plans, requirements, design documents, and user manuals. Ensure documentation is up-to-date and accessible to relevant stakeholders.
10. Continuous Improvement: Identify opportunities for process improvement and optimization within the BI project lifecycle. Implement best practices and lessons learned from previous projects.

Requirements for a BI Project Manager typically include:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience in managing BI projects, preferably in a project management role.
- Strong analytical and problem-solving skills, with the ability to interpret complex data.
- Proficiency in BI tools such as Tableau, Power BI, or QlikView.
- Knowledge of data warehousing concepts, ETL processes, and data modeling.
- Excellent communication and interpersonal skills to effectively collaborate with stakeholders.
- Project management certification (e.g., PMP) is a plus.
- Familiarity with Agile or Scrum methodologies is desirable.
- Ability to work in a fast-paced, dynamic environment and manage multiple projects simultaneously.
Posted 1 month ago
2 - 5 years
14 - 17 Lacs
Bengaluru
Work from Office
As a Big Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Big Data development: Hadoop, Hive, Spark, PySpark, strong SQL
- Ability to incorporate a variety of statistical and machine learning techniques
- Basic understanding of cloud platforms (AWS, Azure, etc.)
- Ability to use programming languages such as Java, Python, and Scala to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
- Basic understanding of or experience with predictive/prescriptive modeling
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
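As a sketch of the batch ETL work described above (Hive, Spark, PySpark, SQL), the following hypothetical PySpark job reads a Hive table, aggregates it, and writes a curated table. The database and table names, columns, and filter are assumptions made only for illustration.

```python
# Minimal sketch: batch PySpark pipeline reading a Hive source table,
# applying transformations, and writing a curated table. Names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("orders-batch-etl")
         .enableHiveSupport()
         .getOrCreate())

# Source table registered in the Hive metastore.
orders = spark.table("raw_db.orders")

# Filter, derive a date column, and aggregate per day and region.
curated = (orders
           .filter(F.col("status") == "COMPLETED")
           .withColumn("order_date", F.to_date("order_ts"))
           .groupBy("order_date", "region")
           .agg(F.sum("amount").alias("total_amount"),
                F.countDistinct("customer_id").alias("unique_customers")))

# Persist the curated result as a managed table for downstream consumers.
(curated.write
 .mode("overwrite")
 .saveAsTable("curated_db.daily_orders_by_region"))
```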
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Bengaluru
Work from Office
Strong experience with the Continuous Flow Graph tool used for point-based development. Design, develop, and maintain ETL processes using Ab Initio tools. Write, test, and deploy Ab Initio graphs, scripts, and other necessary components. Troubleshoot and resolve data processing issues and improve performance.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 8+ years of overall experience, with 5+ years of relevant experience
- Extract, transform, and load data from various sources into data warehouses, operational data stores, or other target systems
- Work with different data formats, including structured, semi-structured, and unstructured data

Preferred technical and professional experience:
- Effective communication and presentation skills
- Industry expertise / specialization
Posted 1 month ago
3 - 7 years
5 - 9 Lacs
Mumbai
Work from Office
About The Role
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage the career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate the team vision and clear objectives.

Senior Process Manager Roles and responsibilities:
We are seeking a talented and motivated Data Engineer to join our dynamic team. The ideal candidate will have a deep understanding of data integration processes and experience in developing and managing data pipelines using Python, SQL, and PySpark within Databricks. You will be responsible for designing robust backend solutions, implementing CI/CD processes, and ensuring data quality and consistency.
- Data Pipeline Development: Use Databricks features to explore raw datasets and understand their structure. Create and optimize Spark-based workflows. Create end-to-end data processing pipelines, including ingesting raw data, transforming it, and running analyses on the processed data. Create and maintain data pipelines using Python and SQL.
- Solution Design and Architecture: Design and architect backend solutions for data integration, ensuring they are robust, scalable, and aligned with business requirements. Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks.
- Automation and Scheduling: Automate data integration processes and schedule jobs on servers to ensure seamless data flow.
- Data Quality and Monitoring: Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency.
- CI/CD Implementation: Use Jenkins and Bitbucket to create and maintain metadata and job files. Implement continuous integration and continuous deployment (CI/CD) processes in both development and production environments to deploy data pipelines efficiently.
- Collaboration and Documentation: Work effectively with cross-functional teams, including software engineers, data scientists, and DevOps, to ensure successful project delivery. Document data pipelines and architecture to ensure knowledge transfer and maintainability. Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows.

Technical and Functional Skills:
- Education and Experience: Bachelor's degree with 7+ years of experience, including at least 3+ years of hands-on experience in SQL and Python.
- Technical Proficiency: Proficiency in writing and optimizing SQL queries in MySQL and SQL Server. Expertise in Python for writing reusable components and enhancing existing ETL scripts.
- Solid understanding of ETL concepts and data pipeline architecture, including CDC, incremental loads, and slowly changing dimensions (SCDs). Hands-on experience with PySpark; knowledge of and experience with Databricks is a bonus. Familiarity with data warehousing solutions and ETL processes. Understanding of data architecture and backend solution design.
- Cloud and CI/CD Experience: Experience with cloud platforms such as AWS, Azure, or Google Cloud. Familiarity with Jenkins and Bitbucket for CI/CD processes.
- Additional Skills: Ability to work independently and manage multiple projects simultaneously.
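Since the profile calls out CDC, incremental loads, and SCDs on Databricks, here is a minimal sketch of an incremental upsert using a Delta Lake MERGE. It assumes a Databricks/Delta Lake runtime; the paths and the customer_id key are hypothetical placeholders, not a prescribed implementation.

```python
# Minimal sketch: incremental upsert (CDC-style) into a Delta table using MERGE.
# Assumes a Databricks/Delta Lake runtime; paths and key columns are hypothetical.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("incremental-upsert").getOrCreate()

# Newly arrived change records (e.g. from the ingestion/landing layer).
updates = spark.read.format("parquet").load("/data/landing/customers_changes")

# Existing curated Delta table to merge into.
target = DeltaTable.forPath(spark, "/data/curated/customers")

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()       # apply changes to existing rows
 .whenNotMatchedInsertAll()    # insert brand-new rows
 .execute())
```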
Posted 1 month ago
6 - 11 years
10 - 14 Lacs
Bengaluru
Work from Office
Wissen Infotech is hiring an Informatica Developer
Experience: 5+ years
Location: Bangalore
Notice: Immediate

Responsibilities:
- This role is for BI KPI portal operations and development. Domains include: data integration, data extraction from legacy systems, data warehousing, and efficient Extract/Transform/Load (ETL) workflows (Informatica PowerCenter and Informatica Cloud services)
- Strong experience in design, development, and testing of Informatica-based applications (PowerCenter 10.2 and Informatica Cloud services)
- Strong knowledge of Oracle database, PL/SQL development, and UNIX scripting
- Should understand the overall system landscape, including upstream and downstream systems
- Excellent knowledge of debugging, tuning, and optimizing performance of database queries
- Good experience in data integration: data extraction from legacy systems and loading into Redshift and Redshift Spectrum
- Support the module in production, resolve hot issues, and implement and deploy enhancements to the application/package
- Proficient in the end-to-end software development life cycle, including requirement analysis, design, development, code review, and testing
- Responsible for ensuring defect-free and on-time delivery
- Responsible for issue resolution along with corrective and preventive measures
- Able to manage a diverse set of stakeholders and report on key project metrics/KPIs
- Lead brainstorming sessions, provide guidance to team members, identify value creation areas, and be responsible for quality control
- Establish standard processes and procedures and promote team collaboration and self-improvement
- Able to work with agile methodologies (Jira)

Qualification & Education:
- Graduate engineering degree (B.E. / B.Tech)
- 4+ years of experience working with Informatica, Informatica Cloud, data warehousing, and Unix scripting
- 4+ years of experience working in Agile teams
- Demonstrates strong ability to articulate technical concepts and implications to business partners
- Excellent communication skills
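The load-into-Redshift step mentioned above is commonly done with a COPY statement; the sketch below issues one through psycopg2. This is not an Informatica mapping and only illustrates the load step; the cluster endpoint, S3 bucket, IAM role, and table name are hypothetical assumptions.

```python
# Minimal sketch: loading staged files from S3 into Redshift with a COPY statement
# issued through psycopg2. Endpoint, bucket, IAM role, and table are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="mycluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="secret",
)
try:
    with conn.cursor() as cur:
        # COPY pulls the staged Parquet files from S3 into the staging table.
        cur.execute("""
            COPY staging.orders
            FROM 's3://my-bucket/exports/orders/'
            IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
            FORMAT AS PARQUET;
        """)
    conn.commit()
finally:
    conn.close()
```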
Posted 1 month ago
3 - 8 years
7 - 11 Lacs
Hyderabad
Work from Office
The Impact You Will Have in This Role:
We are seeking a skilled Talend Developer with expertise in Power BI development and SQL Server to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes using Talend, creating insightful data visualizations with Power BI, and will be an expert in writing stored procedures/queries on MS SQL Server databases.

What You'll Do:
- Design, develop, and maintain ETL processes using Talend to extract, transform, and load data from various sources.
- Create and maintain data visualizations and dashboards using Power BI to provide actionable insights to stakeholders.
- Write high-performance queries on SQL Server databases, ensuring data integrity, performance, and security.
- Collaborate with cross-functional teams to gather requirements, design solutions, and implement data integration and reporting solutions.
- Troubleshoot and resolve issues related to ETL processes, data visualizations, and database performance.
- Collaborate with other team members and analysts through the delivery cycle.
- Participate in an Agile delivery team that builds high-quality and scalable work products.
- Support production releases and maintenance windows, working with the Operations team.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Talents Needed for Success:
- Minimum 3+ years in writing ETL processes
- Proven experience as a Talend Developer, with a strong understanding of ETL processes and data integration.
- Proficiency in Power BI development, including creating dashboards, reports, and data models.
- Expertise in SQL Server, including database design, optimization, and performance tuning.
- Strong understanding of agile processes (Kanban and Scrum); a working knowledge of JIRA is required.
- Strong analytical and problem-solving skills, with the ability to work independently and as part of a team.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels.

Additional Qualifications Needed for Success:
- Talend Expertise: Proficiency in using Talend Studio for data integration, data quality, and file manipulation. This includes designing and developing ETL processes, creating and managing Talend jobs, and using Talend components for data transformation and integration.
- Data Integration Knowledge in Talend: Understanding of data integration concepts and best practices. This includes experience with data extraction, transformation, and loading (ETL) processes, as well as knowledge of data warehousing and data modeling.
- Database Skills: Proficiency in working with various databases, including MS SQL and/or Oracle databases. This includes writing complex SQL queries, understanding database schemas, and performing data migrations.
- Version Control and Collaboration: Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence). This is important for managing code changes, collaborating with team members, and tracking project progress.
- Job Scheduling and Automation: Experience with job scheduling and automation tools. This includes setting up and managing Talend jobs using schedulers like Talend Administration Center (TAC), Autosys, or third-party tools to automate ETL workflows.
- Data Visualization: Ability to create visually appealing and insightful reports and dashboards.
This involves selecting appropriate visualizations, designing layouts, and using custom visuals when necessary in Power BI.
- Power Query: Expertise in using Power Query for data transformation and preparation. This involves cleaning, merging, and shaping data from various sources.
- Expertise in scripting languages such as Python, and Shell/Batch programming, is a plus.
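As an illustration of the stored-procedure work on MS SQL Server that this role emphasises, here is a minimal pyodbc sketch. The connection string, driver version, and the procedure name dbo.usp_GetDailySales are hypothetical assumptions, not part of the posting.

```python
# Minimal sketch: execute a SQL Server stored procedure with pyodbc and read results.
# Connection details and the procedure name (dbo.usp_GetDailySales) are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver.example.com;DATABASE=Reporting;UID=etl_user;PWD=secret"
)
try:
    cursor = conn.cursor()
    # Parameter markers (?) keep the call safe from injection and plan-cache friendly.
    cursor.execute("EXEC dbo.usp_GetDailySales @ReportDate = ?", "2024-01-31")
    for row in cursor.fetchall():
        print(row.Region, row.TotalSales)
finally:
    conn.close()
```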
Posted 1 month ago
5 - 10 years
15 - 20 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Develop and implement custom Splunk TAs for data parsing and integration
- Manage deployment and upgrades of Splunk Universal and Heavy Forwarders
- Oversee configuration and maintenance of the Splunk Deployment Server
- Upgrade and maintain Splunk Add-On applications and test environments

Required Skills:
- Strong hands-on experience in Splunk infrastructure and data management
- Expertise in troubleshooting and analytics
- Excellent communication and team collaboration skills

Resume Submission Checklist:
- Notice period: immediate joiners or a maximum of 15 days only
Posted 1 month ago
3 - 8 years
12 - 15 Lacs
Ahmedabad
Work from Office
Role & responsibilities
- Design, code, and test complex programs and scripts to enhance existing SAP systems and resolve issues; post-implementation support.
- Develop APIs and system integration through middleware.
- Collaborate with stakeholders, IT architects, developers, and infrastructure teams to guarantee smooth application delivery.
- Manage a team of SAP specialists and mentor them on system implementation, maintenance, and support.
- Upgrade SAP Business One to include new features such as mobile users.
- Optimize SAP Business One and SAP S/4HANA processes per the business requirements of the SAP implementation throughout the business unit, and promptly assist with IT support.
- Work with each HOD to coordinate updates to the SAP IT protocol SOP.
- Support technical tasks and assign IT tasks for completion and execution.
- Respond to SAP calls daily and complete tasks from team members and outside consultants.
- Vendor management, SLA management, and IT procurement and negotiations.
- Manage internal clients and understand their needs regarding the technical solution from a maintenance standpoint.
- Evaluate, analyze, and assess new technologies and solutions, and derive methods and operating concepts for creating operating standards for SAP operational landscapes.
- Apply IT security and IT policies, in line with the IT department.

Preferred candidate profile
- SAP Business One and SAP S/4HANA, including installation and upgrades
- SAP applications standard business processes, configurations, setups, and standards
- Extensive knowledge of API design
- Technical expertise in Crystal Reports and BI Objects
- Hands-on experience in SAP coding & iteration
- Power BI
Posted 1 month ago
8 - 13 years
30 - 35 Lacs
Bengaluru
Work from Office
Power BI Architect - J48917
As a Data Engineer (BI Analytics & DWH), you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower our organization to make data-driven decisions. You will leverage your expertise in Power BI, Tableau, and ETL processes to create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels.

Key Responsibilities:
- BI Architecture & DWH Solution Design: Develop and design scalable BI analytical & DWH solutions that meet business requirements, leveraging tools such as Power BI and Tableau.
- Data Integration: Oversee the ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data Modelling: Create and maintain data models that support analytical reporting and data visualization initiatives.
- Database Management: Utilize SQL to write complex queries and stored procedures, and manage data transformations using joins and cursors.
- Visualization Development: Lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization.
- Collaboration: Work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance Optimization: Analyse and optimize BI solutions for performance, scalability, and reliability.
- Data Governance: Implement best practices for data quality and governance to ensure accurate reporting and compliance.
- Team Leadership: Mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement.
- Azure Databricks: Leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions.

Required Candidate Profile:
- Candidate experience: 8 to 15 years
- Candidate degree: BE-Comp/IT
Posted 1 month ago
3 - 6 years
10 - 14 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience in the integration efforts between Alation and Manta, ensuring seamless data flow and compatibility
- Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively
- Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities
- Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns

Preferred technical and professional experience:
- Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives
- Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation
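Since the responsibilities mention building enterprise search applications such as Elasticsearch, here is a minimal, hypothetical sketch using the Python Elasticsearch client (8.x-style API): index a metadata document and query it back. The host, index name, and fields are placeholders, not anything specified by the posting.

```python
# Minimal sketch: index a document and run a match query with the Python
# Elasticsearch client (8.x style). Host, index name, and fields are hypothetical.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Index a small metadata document (placeholder fields for illustration).
es.index(
    index="asset-metadata",
    id="orders-table",
    document={"name": "orders", "owner": "finance", "lineage_source": "crm_db"},
)

# Search for documents owned by the finance team.
resp = es.search(
    index="asset-metadata",
    query={"match": {"owner": "finance"}},
)
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_source"]["name"])
```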
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Bengaluru
Work from Office
IICS - CDI (Cloud Data Integration) and CAI (Cloud Application Integration): both are mandatory skills. Most IICS candidates have experience with CDI only, but at Day and Ross we use both CAI and CDI, and this requirement is for a lead role. Understanding of business processes and how data supports business decision-making is expected. The candidate should communicate well with the client and be able to manage a team of 2-3 resources.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 8-12 years of overall experience, with 4+ years of experience in Informatica Intelligent Cloud Services (IICS) components
- Working experience in IICS components: application integration and data integration
- Experience with IDMC administration and architecture, handling admin-related activities
- Experience in developing application integrations using SOAP/REST APIs, with APIs leveraged to integrate multiple systems
- Functional knowledge of using Postman for testing APIs
- Strong SQL skills and performance tuning capabilities
- Excellent knowledge of the Informatica platform as a whole and the integration among different Informatica components and services

Preferred technical and professional experience:
- Good communication skills for handling clients and leading a team
- Industry expertise / specialization
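The posting mentions testing SOAP/REST APIs (typically with Postman); the sketch below shows the same kind of round-trip check scripted with the Python requests library. The base URL, auth header, endpoint, and payload are hypothetical, and this is an illustration rather than an IICS artifact.

```python
# Minimal sketch: a requests-based smoke test for a REST endpoint, similar to the
# checks typically done in Postman. URL, auth token, and payload are hypothetical.
import requests

BASE_URL = "https://api.example.com/v1"
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

# Create a record, then read it back and verify the round trip.
created = requests.post(
    f"{BASE_URL}/customers",
    json={"name": "Acme Corp", "segment": "enterprise"},
    headers=HEADERS,
    timeout=30,
)
created.raise_for_status()
customer_id = created.json()["id"]

fetched = requests.get(f"{BASE_URL}/customers/{customer_id}", headers=HEADERS, timeout=30)
assert fetched.status_code == 200
assert fetched.json()["name"] == "Acme Corp"
```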
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Hyderabad
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our client's business requirements
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems
- Implement data quality and validation processes within Ab Initio
- Data modelling and analysis: collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes; analyse and model data to ensure optimal ETL design and performance
- Ab Initio components: utilize components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions; implement best practices for reusable Ab Initio components

Preferred technical and professional experience:
- Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization; conduct performance tuning and troubleshooting as needed
- Collaboration: work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes; participate in design reviews and provide technical expertise to enhance overall solution quality
- Documentation
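The Ab Initio components named above (Rollup, Join, Normalize) are built in Ab Initio's graphical environment, so they cannot be shown as plain code here; as a rough analogue only, the hypothetical PySpark snippet below shows the corresponding operations (groupBy/agg, join, explode) on toy data.

```python
# Minimal sketch: PySpark analogues of the Ab Initio components named above
# (Rollup ~ groupBy/agg, Join ~ join, Normalize ~ explode). This is not Ab Initio
# code; graphs there are built in the Ab Initio GDE. Data and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("component-analogues").getOrCreate()

orders = spark.createDataFrame(
    [("o1", "c1", 100.0, ["sku1", "sku2"]), ("o2", "c1", 50.0, ["sku3"])],
    ["order_id", "customer_id", "amount", "items"],
)
customers = spark.createDataFrame([("c1", "Acme")], ["customer_id", "name"])

# Join: enrich orders with customer attributes.
enriched = orders.join(customers, "customer_id", "left")

# Rollup: aggregate to one row per customer.
rolled_up = enriched.groupBy("customer_id", "name").agg(F.sum("amount").alias("total"))

# Normalize: expand the items array into one row per item.
normalized = enriched.select("order_id", F.explode("items").alias("item"))

rolled_up.show()
normalized.show()
```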
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Bengaluru
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Your primary responsibilities include:
- Develop and maintain data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools
- Liaise with the business team and technical leads, gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing, and support UAT
- Work with data scientists and the business analytics team to assist with data ingestion and data-related technical issues

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in data warehousing / information management / data integration / business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration to the cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, big data, etc.

Preferred technical and professional experience:
- Knowledge of MS Azure Cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
Posted 1 month ago
5 - 10 years
9 - 19 Lacs
Bengaluru, Gurgaon, Mumbai (All Areas)
Hybrid
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Modeling Techniques and Methodologies

Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies (SSI / NON SSI)
Good to have skills: SSI: Data Engineering, Cloud Data Migration; NON SSI: -

Job Requirements:
Key Responsibilities:
1. Drive discussions with client deal teams to understand business requirements and how the Industry Data Model fits into implementation and solutioning
2. Develop the solution blueprint and the scoping, estimation, and staffing for the delivery project and solutioning
3. Drive discovery activities and design workshops with the client and lead strategic road-mapping and operating model design discussions
4. Good to have: Data Vault, Cloud DB design, graph data modeling, ontology, data engineering, data lake design

Technical Experience:
1. 7+ years of overall experience, with 3+ years in data modeling; Cloud DB models, 3NF, dimensional modeling, and conversion of an RDBMS data model to a graph data model; instrumental in DB design through all stages of the data model
2. Experience on at least one Cloud DB design engagement; must be familiar with data architecture principles

Professional Attributes:
1. Strong requirement analysis and technical solutioning skills in Data and Analytics
2. Excellent writing, communication and presentation skills
3. Eagerness to learn and develop oneself on an ongoing basis
4. Excellent client-facing and interpersonal skills
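For the "conversion of an RDBMS data model to a graph data model" item, here is a toy, hypothetical sketch using networkx: rows become nodes and foreign keys become edges. It only illustrates the mapping idea, not any particular client's model or tooling.

```python
# Minimal sketch: converting relational rows and foreign keys into a property graph
# with networkx. Tables (customers, orders) and keys are hypothetical placeholders.
import networkx as nx

customers = [{"customer_id": "c1", "name": "Acme"}]
orders = [{"order_id": "o1", "customer_id": "c1", "amount": 100.0}]

g = nx.DiGraph()

# Each row becomes a labelled node; each foreign key becomes a relationship edge.
for row in customers:
    g.add_node(("Customer", row["customer_id"]), **row)
for row in orders:
    g.add_node(("Order", row["order_id"]), **row)
    g.add_edge(("Order", row["order_id"]), ("Customer", row["customer_id"]),
               relationship="PLACED_BY")

print(list(g.edges(data=True)))
```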
Posted 1 month ago
5 - 10 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: IBM InfoSphere DataStage
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating solutions that align with business needs and application specifications.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the team in implementing innovative solutions
- Conduct regular team meetings to ensure project progress
- Stay updated on industry trends and technologies

Professional & Technical Skills:
- Must Have Skills: Proficiency in IBM InfoSphere DataStage
- Strong understanding of ETL processes
- Experience in data integration and transformation
- Knowledge of data warehousing concepts
- Hands-on experience in troubleshooting and debugging DataStage jobs

Additional Information:
- The candidate should have a minimum of 5 years of experience in IBM InfoSphere DataStage
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education
Posted 1 month ago
7 - 12 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and contribute to key decisions.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Stay updated with the latest technologies and trends
- Conduct regular knowledge sharing sessions

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP BusinessObjects Data Services
- Strong understanding of ETL processes
- Experience in data integration and data quality management
- Knowledge of SAP BusinessObjects reporting tools
- Hands-on experience in data modeling and data warehousing

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services
- This position is based at our Hyderabad office
- A 15 years full-time education is required

Qualification: 15 years full time education
Posted 1 month ago
3 - 8 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with stakeholders to understand business needs and translating them into functional design solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Required active participation/contribution in team discussions
- Contribute to providing solutions to work-related problems
- Collaborate with stakeholders to gather and analyze requirements
- Design and develop applications based on business process requirements
- Implement best practices for application design and development
- Conduct testing and debugging to ensure application functionality
- Provide technical support and guidance to team members

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP BusinessObjects Data Services
- Strong understanding of data integration and ETL processes
- Experience with data modeling and database design
- Knowledge of SAP BusinessObjects reporting tools
- Hands-on experience in application design and development

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BusinessObjects Data Services
- This position is based at our Bengaluru office
- A 15 years full time education is required

Qualification: 15 years full time education
Posted 1 month ago
3 - 8 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with stakeholders to understand business needs and translating them into functional design solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Required active participation/contribution in team discussions
- Contribute to providing solutions to work-related problems
- Collaborate with stakeholders to gather and analyze requirements
- Design and develop applications based on business process requirements
- Implement best practices for application design and development
- Conduct testing and debugging to ensure application functionality
- Provide technical support and guidance to team members

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP BusinessObjects Data Services
- Strong understanding of ETL processes and data integration
- Experience with data modeling and database design
- Knowledge of SAP BusinessObjects reporting tools
- Familiarity with software development lifecycle methodologies

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BusinessObjects Data Services
- This position is based at our Bengaluru office
- A 15 years full time education is required

Qualification: 15 years full time education
Posted 1 month ago