6.0 - 12.0 years
13 - 17 Lacs
Chennai
Work from Office
Are you a visionary who thrives on designing future-ready data ecosystems? Let's build the next big thing together! We're working with top retail and healthcare leaders to transform how they harness data, and we're looking for a Data Architect to guide that journey. We are looking for an experienced Data Architect with deep knowledge of Databricks and cloud-native data architecture. This role will drive the design and implementation of scalable, high-performance data platforms to support advanced analytics, business intelligence, and data science initiatives within a retail or healthcare environment.

Key Responsibilities:
- Define and implement enterprise-level data architecture strategies using Databricks.
- Design end-to-end data ecosystems including ingestion, transformation, storage, and access layers.
- Lead data governance, data quality, and security initiatives across the organization.
- Work with stakeholders to align data architecture with business goals and compliance requirements.
- Guide the engineering team on best practices in data modeling, pipeline development, and system optimization.
- Champion the use of Delta Lake, Lakehouse architecture, and real-time analytics (a minimal sketch follows below).

Required Qualifications:
- 8+ years of experience in data architecture or solution architecture roles.
- Strong expertise in Databricks, Spark, Delta Lake, and data warehousing concepts.
- Solid understanding of modern data platform tools (Snowflake, Azure Synapse, BigQuery, etc.).
- Experience with cloud architecture (Azure preferred), data governance, and MDM.
- Strong understanding of healthcare or retail data workflows and regulatory requirements.
- Excellent communication and stakeholder management skills.

Benefits: Health Insurance, Accident Insurance. The salary will be determined based on several factors including, but not limited to, location, relevant education, qualifications, experience, technical skills, and business needs.

Additional Responsibilities:
- Participate in OP monthly team meetings and team-building efforts.
- Contribute to OP technical discussions, peer reviews, etc.
- Contribute content and collaborate via the OP-Wiki/Knowledge Base.
- Provide status reports to OP Account Management as requested.

About us: OP is a technology consulting and solutions company, offering advisory and managed services, innovative platforms, and staffing solutions across a wide range of fields including AI, cyber security, enterprise architecture, and beyond. Our most valuable asset is our people: dynamic, creative thinkers who are passionate about doing quality work. As a member of the OP team, you will have access to industry-leading consulting practices, strategies and technologies, and innovative training and education. An ideal OP team member is a technology leader with a proven track record of technical excellence and a strong focus on process and methodology.
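As context for the Delta Lake and Lakehouse emphasis above, here is a minimal, hedged sketch of the incremental-upsert pattern such platforms typically rely on. It assumes a Databricks runtime (or a local PySpark session with the Delta Lake library installed); the table and column names (customers, updates, id, email) are illustrative only, not from the posting.

```python
from pyspark.sql import SparkSession

# Assumes a Databricks runtime or delta-spark installed locally;
# table/column names are illustrative, not from the job posting.
spark = SparkSession.builder.appName("delta-upsert-sketch").getOrCreate()

# Seed an illustrative Delta table of customer records.
spark.createDataFrame(
    [(1, "alice@example.com"), (2, "bob@example.com")], ["id", "email"]
).write.format("delta").mode("overwrite").saveAsTable("customers")

# A batch of changed/new rows arriving from an ingestion layer.
spark.createDataFrame(
    [(2, "bob@new.example.com"), (3, "carol@example.com")], ["id", "email"]
).createOrReplaceTempView("updates")

# Delta Lake MERGE: update matching rows, insert new ones (an upsert).
spark.sql("""
    MERGE INTO customers AS t
    USING updates AS s
    ON t.id = s.id
    WHEN MATCHED THEN UPDATE SET t.email = s.email
    WHEN NOT MATCHED THEN INSERT (id, email) VALUES (s.id, s.email)
""")
```

MERGE is the operation that makes a Lakehouse table behave like a warehouse dimension: late-arriving or corrected records are reconciled in place rather than appended as duplicates.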
Posted 2 months ago
14.0 - 17.0 years
12 - 17 Lacs
Pune
Work from Office
Experience required: 10-15 years. Position: full-time. Mode: hybrid. The candidate must be based in Pune, as the place of work is Pashan, Pune. We need a strongly technical person; the role is roughly 40 percent managerial and 60 percent technical.

Roles and responsibilities:
- Collaborate with internal teams to produce software design and architecture.
- Write clean, scalable code using .NET programming languages (.NET Core and .NET Framework).
- Prepare and maintain code for various .NET applications and resolve any defects in systems.
- Revise, update, refactor, and debug code; improve existing software.
- Develop documentation throughout the software development lifecycle.
- Monitor everyday activities of the system, serve as an expert on applications, and provide technical support.

Preference:
- 8+ years of experience.
- Excellent communication skills.
- Capacity for critical thinking and creativity.
- A systematic and logical approach to problem-solving; team-working skills.
- Provide expert advice to project teams on the use of integration technology, data architecture, modelling, and system architecture, including integration best practices.
- Communicate project status to various levels of management.
- Manage an Integration/Architecture Roadmap and project backlog in partnership with the R&D leadership team, prioritize initiatives in line with business goals, and drive design and deployment of integration solutions that enable scalability, high availability, and re-use.
- Hands-on experience with .NET/Java is also required; AI/Azure OpenAI knowledge is a big plus.

Requirements:
- Good expertise in MS Entity Framework/Dapper.
- Proven experience as a .NET developer.
- Familiarity with the .NET Framework, SQL Server, and design/architectural patterns (e.g., Model-View-Controller (MVC)).
- Familiarity with how ASP.NET Core applications work.
- Knowledge of at least one of the .NET languages (e.g., C#).
- Familiarity with architecture styles/APIs (REST, RPC).
- Experience with alerting mechanisms for APIs in case of failures.
- Understanding of Agile methodologies.
- Good troubleshooting and communication skills.
- Experience with concurrent development and source control (Git).
Posted 2 months ago
12.0 - 13.0 years
35 - 60 Lacs
Bengaluru
Work from Office
ECMS #: 528867. Number of openings: 1. Duration of project: 6 months initially. Years of experience: 12. Mandatory skills: Databricks, Python, PySpark, Architect. Vendor billing range (local currency/day): 10000 INR/day. Work location: Bangalore/Pune. Hybrid/Remote/WFO: Hybrid. BGV pre/post onboarding: pre-onboarding. Shift timings: general shift, with a need to extend a couple of hours if required.

Role: Data Architect. In the role of Data Architect, you will interface with key stakeholders and apply your knowledge to understand the business and business data across source systems. You will play an important role in creating a detailed business data understanding, outlining problems, opportunities, and data solutions for a business.

Basic:
- Bachelor's degree or foreign equivalent required from an accredited institution. Three years of progressive experience in the specialty will be considered in lieu of every year of education.
- At least 12+ years of experience with Information Technology and 5+ years as a Data Architect.
- Extensive experience in design and architecture of large data transformation systems.

Preferred:
- Understanding of the business area the project is involved with; working with data stewards to understand the data sources.
- Clear understanding of data entities, relationships, cardinality, etc. for the inbound sources, based on inputs from the data stewards/source-system experts.
- Performance tuning: understanding the overall requirement and its reporting impact.
- Data modeling for the business and reporting models, as required by the reporting needs or delivery needs of downstream systems.
- Experience with components and languages such as Databricks, Python, PySpark, Scala, and R.
- Ability to ask strong questions to help the team see areas that may lead to problems.
- Ability to validate data by writing SQL queries and comparing against the source system and transformation mapping (a minimal sketch follows below).
- Work closely with teams to collect and translate information requirements into data to develop data-centric solutions.
- Ensure that industry-accepted data architecture principles and standards are integrated and followed for modeling, stored procedures, replication, regulations, and security, among other concepts, to meet technical and business goals.
- Continuously improve the quality, consistency, accessibility, and security of data activity across company needs.
- Experience with Azure DevOps or equivalent project-tracking tools such as JIRA.
- Outstanding verbal and non-verbal communication.
- Experience with, and desire to work in, a global delivery environment.
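As a hedged illustration of the SQL-based validation called out above, the sketch below compares row counts and flags rows that diverge between a source extract and its transformed target. It assumes a PySpark session; the table names (src_orders, tgt_orders) and columns are purely illustrative.

```python
from pyspark.sql import SparkSession

# Illustrative source-vs-target reconciliation; table/column names are
# hypothetical, not taken from the job posting.
spark = SparkSession.builder.appName("validation-sketch").getOrCreate()

spark.createDataFrame([(1, 100.0), (2, 250.0)], ["order_id", "amount"]) \
    .createOrReplaceTempView("src_orders")
spark.createDataFrame([(1, 100.0), (2, 249.0)], ["order_id", "amount"]) \
    .createOrReplaceTempView("tgt_orders")

# 1) Row-count check: totals should match after the transformation.
spark.sql("""
    SELECT (SELECT COUNT(*) FROM src_orders) AS src_rows,
           (SELECT COUNT(*) FROM tgt_orders) AS tgt_rows
""").show()

# 2) Content check: rows present in the source but not in the target.
#    EXCEPT surfaces any value drift introduced by the mapping.
spark.sql("SELECT * FROM src_orders EXCEPT SELECT * FROM tgt_orders").show()
```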
Posted 2 months ago
1.0 - 7.0 years
13 - 14 Lacs
Bengaluru
Work from Office
Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Job Overview: We are seeking a highly skilled and versatile polyglot full-stack developer with expertise in modern frontend and backend technologies, cloud-based solutions, AI/ML, and Gen AI. The ideal candidate will have a strong foundation in full-stack development, cloud platforms (preferably Azure), and hands-on experience in Gen AI, AI, and machine learning technologies.

Key Responsibilities:
- Develop and maintain web applications using Angular/React.js, .NET, and Python.
- Design, deploy, and optimize Azure-native PaaS and SaaS services, including but not limited to Function Apps, Service Bus, Storage Accounts, SQL Databases, Key Vault, ADF, Databricks, and REST APIs with OpenAPI specifications.
- Implement security best practices for data in transit and at rest, and authentication best practices: SSO, OAuth 2.0, and Auth0.
- Utilize Python for developing data processing and advanced AI/ML models using libraries like pandas, NumPy, and scikit-learn, along with LangChain, LlamaIndex, and the Azure OpenAI SDK (a minimal sketch appears at the end of this listing).
- Leverage agentic frameworks like Crew AI, AutoGen, etc.; be well versed in RAG and agentic architecture.
- Be strong in design patterns: architectural, data, and object-oriented.
- Leverage Azure serverless components to build highly scalable and efficient solutions.
- Create, integrate, and manage workflows using Power Platform, including Power Automate, Power Pages, and SharePoint.
- Apply expertise in machine learning, deep learning, and generative AI to solve complex problems.

Primary Skills:
- Proficiency in React.js, .NET, and Python.
- Strong knowledge of Azure cloud services, including serverless architectures and data security.
- Experience with Python data analytics libraries: pandas, NumPy, scikit-learn, Matplotlib, Seaborn.
- Experience with Python generative AI frameworks: LangChain, LlamaIndex, Crew AI, AutoGen.
- Familiarity with REST API design, Swagger documentation, and authentication best practices.

Secondary Skills:
- Experience with Power Platform tools such as Power Automate, Power Pages, and SharePoint integration.
- Knowledge of Power BI for data visualization (preferred).

Preferred Knowledge Areas (nice to have): In-depth understanding of machine learning and deep learning, including supervised and unsupervised algorithms.
Mandatory skill sets: AI, ML. Preferred skill sets: AI, ML. Years of experience required: 3-7 years. Education qualification: BE/BTech, ME/MTech, MBA, MCA. Degrees/fields of study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering, Master of Engineering. Required skills: Game AI, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more}
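To ground the pandas/scikit-learn expectation above, here is a minimal, hedged sketch of a supervised-learning pipeline of the kind the role describes. Everything in it (the synthetic data, feature names, and model choice) is illustrative, not taken from the posting.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic, illustrative data: two numeric features and a binary label.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "feature_a": rng.normal(size=200),
    "feature_b": rng.normal(size=200),
})
df["label"] = (df["feature_a"] + df["feature_b"] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df[["feature_a", "feature_b"]], df["label"], test_size=0.25, random_state=0
)

# Scale features, then fit a simple classifier: a common baseline pipeline.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression()),
])
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```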
Posted 2 months ago
12.0 - 18.0 years
40 - 45 Lacs
Bengaluru
Work from Office
Not Applicable. Specialism: Data, Analytics & AI. Management Level: Director.

Summary: We are seeking an experienced Senior Data Architect to lead the design and development of our data architecture, leveraging cloud-based technologies, big data processing frameworks, and DevOps practices. The ideal candidate will have a strong background in data warehousing, data pipelines, performance optimization, and collaboration with DevOps teams.

Responsibilities:
1. Design and implement end-to-end data pipelines using cloud-based services (AWS/GCP/Azure) and conventional data processing frameworks.
2. Lead the development of data architecture, ensuring scalability, security, and performance.
3. Collaborate with cross-functional teams, including DevOps, to design and implement data lakes, data warehouses, and data ingestion/extraction processes.
4. Develop and optimize data processing workflows using PySpark, Kafka, and other big data processing frameworks (a minimal streaming sketch appears at the end of this listing).
5. Ensure data quality, integrity, and security across all data pipelines and architectures.
6. Provide technical leadership and guidance to junior team members.
7. Design and implement data load strategies, data partitioning, and data storage solutions.
8. Collaborate with stakeholders to understand business requirements and develop data solutions to meet those needs.
9. Work closely with the DevOps team to ensure seamless integration of data pipelines with the overall system architecture.
10. Participate in the design and implementation of CI/CD pipelines for data workflows.

DevOps Requirements:
1. Knowledge of DevOps practices and tools, such as Jenkins, GitLab CI/CD, or Apache Airflow.
2. Experience with containerization using Docker.
3. Understanding of infrastructure-as-code (IaC) concepts using tools like Terraform or AWS CloudFormation.
4. Familiarity with monitoring and logging tools, such as Prometheus, Grafana, or the ELK Stack.

Requirements:
1. 12-14 years of experience in data architecture, data warehousing, and big data processing for a Senior Data Architect.
2. Strong expertise in cloud-based technologies (AWS/GCP/Azure) and data processing frameworks (PySpark, Kafka, Flink, Beam, etc.).
3. Experience with data ingestion, data extraction, data warehousing, and data lakes.
4. Strong understanding of performance optimization, data partitioning, and data storage solutions.
5. Excellent leadership and communication skills.
6. Experience with NoSQL databases is a plus.

Mandatory skill sets:
1. Experience with agile development methodologies.
2. Certification in cloud-based technologies (AWS/GCP/Azure) or data processing frameworks.
3. Experience with data governance, data quality, and data security.

Preferred skill sets: Knowledge of agentic AI and GenAI is an added advantage. Years of experience required: 12 to 18 years. Education qualification: graduate engineer or management graduate. Degrees/fields of study required: Bachelor of Engineering. Required skills: AWS DevOps, Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Influence, Innovation, Intellectual Curiosity, Learning Agility {+ 28 more}
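As a hedged companion to the PySpark/Kafka workflow item above, this is a minimal Structured Streaming sketch: read events from a Kafka topic and append them to a table. The broker address, topic name, and paths are placeholders, and the choice of Delta as the sink is an assumption, not something the posting specifies.

```python
from pyspark.sql import SparkSession

# Broker, topic, and paths below are placeholders for illustration only.
spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    # Kafka delivers key/value as binary; decode the payload to a string.
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")  # enables restart recovery
    .outputMode("append")
    .start("/tmp/tables/orders_raw")
)
query.awaitTermination()
```

The checkpoint location is what gives the pipeline its fault tolerance: on restart, Spark resumes from the last committed Kafka offsets rather than reprocessing or dropping events.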
Posted 2 months ago
3.0 - 5.0 years
18 - 20 Lacs
Pune
Work from Office
- Good hands-on experience with ETL and BI tools such as SSIS, SSRS, Power BI, etc.
- Readiness to play an individual contributor role on the technical front.
- Excellent communication skills.
- Readiness to travel onsite for short terms, as required.
- 3-5 years of experience in ETL development, with hands-on experience in a migration or data warehousing project.
- Strong database fundamentals and experience in writing unit test cases and test scenarios.
- Expert knowledge of writing SQL commands, queries, and stored procedures.
- Good knowledge of ETL tools like SSIS, Informatica, etc., and data warehousing concepts.
- Good knowledge of writing macros.
- Good client-handling skills, with onsite experience preferred.

Thank you for considering employment with Fiserv. Please: apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 2 months ago
3.0 - 5.0 years
18 - 20 Lacs
Pune
Work from Office
A Conversion Professional is responsible for the timely and accurate conversion of new and existing bank/client data to Fiserv systems, from both internal and external sources. This role provides data analysis for client projects and accommodates other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Professional plays a critical role in mapping in data to support project initiatives for new and existing banks. Working with financial services data means a high priority on accuracy and adherence to procedures and guidelines.

What will you do: When stepping in as a backup, you will need to review the specification history, then review and understand the code being developed to resolve the issue or change; the same applies on the switch back to the original developer. Today, the associate handling the project would log back in to support the effort and address the issue or change.

What you will need to have:
- Bachelor's degree in programming or a related field.
- Minimum 3 years of relevant experience in data processing (ETL) conversions or the financial services industry.
- 3-5 years of experience and strong knowledge of MS SQL/PSQL, MS SSIS, and data warehousing concepts.
- Strong communication skills and the ability to provide technical information to non-technical colleagues.
- Team player with the ability to work independently.
- Experience in the full software development life cycle using agile methodologies; a good understanding of Agile ceremonies and the ability to handle them.
- Efficiency in reviewing, coding, testing, and debugging application/bank programs.
- Ability to work under pressure while resolving critical issues in the production environment.
- Good communication skills and experience working with clients.
- Good understanding of the banking domain.

What would be great to have:
- Experience with Informatica, Power BI, MS Visual Basic, Microsoft Access, and Microsoft Excel.
- Experience with card management systems and debit card processing is a plus.
- Ability to manage and prioritize a work queue across multiple workstreams.
- Highest attention to detail and accuracy.

Thank you for considering employment with Fiserv. Please: apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 2 months ago
3.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Requirements:
- Hands-on experience in PySpark, preferably more than 3 years, with familiarity with RDD-level programming as well as coding using the Spark DataFrame API.
- Ability to develop and maintain data pipelines and ETLs using Python and PySpark.
- Good knowledge of Python and Spark concepts such as driver vs. worker, actions and transformations, and data partitioning and bucketing (a minimal sketch follows below).
- Experience in data management activities and data lifecycle management, for example activities related to integrating source databases into a data lake via full batch or incremental data updates.
- Ability to design, implement, and optimize Spark jobs for performance and scalability.
- Ability to perform data analysis and troubleshooting to ensure data quality and reliability.
- Experience with cloud technologies, especially object storage for data lakes, is a must.
- Experience with Spark Streaming and GraphX is a plus.

Mandatory skill sets: PySpark. Preferred skill sets: PySpark. Years of experience required: 5+. Education qualification: BE/BTech/MBA/MCA. Degrees/fields of study required: Bachelor of Engineering, Master of Business Administration, Bachelor of Technology. Required skills: PySpark, Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}
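As a hedged illustration of the driver/worker and lazy-evaluation concepts mentioned above: in Spark, transformations only build a plan on the driver, and nothing executes on the workers until an action runs. The DataFrame below is synthetic and purely illustrative.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lazy-eval-sketch").getOrCreate()

df = spark.range(1_000_000)  # synthetic data, illustrative only

# Transformations: lazily build a logical plan on the driver; no work
# is shipped to the worker/executor processes yet.
evens = df.filter(F.col("id") % 2 == 0).withColumn("half", F.col("id") / 2)

# repartition is also a (wide) transformation: it declares a shuffle
# into 8 partitions but does not trigger it.
evens = evens.repartition(8)

# Action: count() forces the plan to execute across the executors.
print(evens.count())  # 500000
```

The practical upshot is that chains of filters and projections cost nothing until an action such as count(), collect(), or a write fires, at which point Spark optimizes and runs the whole plan at once.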
Posted 2 months ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer. Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: Ab Initio. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must-have skills: proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Ab Initio.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 2 months ago
12.0 - 17.0 years
3 - 7 Lacs
Kolkata
Work from Office
Project Role: Data Management Practitioner. Project Role Description: Maintain the quality and compliance of an organization's data assets. Design and implement data strategies, ensuring data integrity and enforcing governance policies. Establish protocols to handle data, safeguard sensitive information, and optimize data usage within the organization. Design and advise on data quality rules and set up effective data compliance policies. Must-have skills: Data Architecture Principles. Good-to-have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: any graduate degree.

Summary: As a Data Management Practitioner, you will be responsible for maintaining the quality and compliance of an organization's data assets. Your role involves designing and implementing data strategies, ensuring data integrity, enforcing governance policies, and optimizing data usage within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Design and advise on data quality rules.
- Set up effective data compliance policies.
- Ensure data integrity and enforce governance policies.
- Optimize data usage within the organization.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Architecture Principles.
- Strong understanding of data management best practices.
- Experience in designing and implementing data strategies.
- Knowledge of data governance and compliance policies.
- Ability to optimize data usage for organizational benefit.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Architecture Principles.
- This position is based at our Kolkata office.
- A graduate degree in any discipline is required.
Posted 2 months ago
5.0 - 10.0 years
8 - 12 Lacs
Pune, Bengaluru, Hinjewadi
Work from Office
Job Summary: Synechron is seeking a dedicated mid-level Murex Datamart Reporting Support Engineer to support and optimize the reporting functions within our financial systems infrastructure. This role focuses on providing effective tier-2 support for Datamart and reporting modules, resolving incidents, and ensuring the accuracy and availability of reports such as P&L, Market Valuation, Accounting, and Risk. The successful candidate will bring solid technical expertise, analytical skills, and communication abilities to collaborate with technical and business stakeholders, supporting operational excellence and continuous improvement. This position plays a crucial role in maintaining reliable reporting outputs, resolving data issues efficiently, and supporting strategic reporting initiatives aligned with business needs.

Software Requirements
Required skills:
- Proficiency with Murex (version 3.1 or above), focusing on Datamart and reporting modules
- Strong SQL skills for data querying, analysis, and solving data-related issues
- Shell scripting (Bash/sh) for automation and troubleshooting tasks
- Experience supporting report generation (P&L, MV, Risk, Accounting)
- Familiarity with incident management tools such as ServiceNow or JIRA
Preferred skills:
- Experience with reporting tools such as Power BI, Tableau, or QlikView
- Knowledge of data warehousing and data architecture concepts
- Basic scripting in Python or Perl for automation tasks

Overall Responsibilities
- Support and maintain Murex Datamart and reporting modules, ensuring system stability and report integrity
- Respond to and resolve L2 support tickets related to report discrepancies, data issues, and system errors
- Collaborate with business users to understand reporting requirements, diagnose issues, and implement solutions
- Perform data analysis and troubleshooting using SQL queries to identify root causes of problems
- Assist in system upgrades, patching, and configuration changes impacting reporting environments
- Automate routine report validation tasks and improve existing processes to enhance efficiency (a minimal sketch follows at the end of this listing)
- Document problem resolutions, configurations, and procedures for team knowledge sharing
- Support incident escalations, communicate effectively with stakeholders, and prioritize support activities
- Participate in continuous improvement initiatives to enhance report accuracy, timeliness, and system performance

Strategic objectives:
- Ensure high availability and accuracy of critical reports
- Reduce report incidents and data inconsistencies
- Automate manual processes to improve operational efficiency
Performance outcomes:
- Consistently high-quality report availability
- Rapid incident resolution with minimal business disruption
- Clear documentation and proactive stakeholder communication

Technical Skills (by category)
Reporting and data analysis (essential):
- Experience supporting Murex Datamart, especially P&L, MV, Risk, and Accounting reports
- SQL mastery for data extraction, validation, and issue diagnosis
- Knowledge of report configuration and static data management
Scripting and automation (essential):
- Shell scripting (Bash/sh) for automating data checks, batch processes, and troubleshooting
- Experience automating routine report validation and data reconciliation
Data management and architecture (essential):
- Understanding of relational databases, data flow, and data warehousing concepts
- Experience with data definitions, static data, and interface setup
Support and incident management (essential):
- Hands-on use of ServiceNow, JIRA, or equivalent service management tools
Additional skills (preferred):
- Basic knowledge of cloud deployment environments
- Familiarity with additional scripting languages like Python or Perl

Experience Requirements
- 5+ years of production support experience with Murex Datamart and reporting modules
- Proven experience in resolving report and data-related issues efficiently
- Experience in a support or operational role in financial services (trading, risk, or accounting)
- Experience working in structured support environments with incident escalation and resolution
- Alternative pathways: candidates demonstrating strong support skills, extensive scripting experience, and a deep understanding of financial reporting can be considered irrespective of exact years if their expertise is aligned

Day-to-Day Activities
- Monitor system dashboards, reports, and alerts for performance issues or data discrepancies
- Troubleshoot and resolve report failures and data anomalies using SQL and scripts
- Engage with business users to clarify reporting needs and resolve issues
- Support system upgrades, patches, and configurations affecting reporting modules
- Automate manual validation routines to improve reliability and efficiency
- Document resolutions, configurations, and operational procedures
- Collaborate with technical teams, support units, and stakeholders for incident resolution
- Participate in shift handovers, incident reviews, and ongoing process improvements

Qualifications
- Bachelor's degree in Computer Science, Finance, Data Management, or a related discipline
- 5+ years supporting Murex Datamart and reporting modules in a production environment
- Strong SQL and shell scripting expertise
- Knowledge of financial reporting processes such as P&L, MV, Risk, and Accounting
- Experience supporting high-pressure environments, managing incidents, and resolving problems
- Willingness to work in shifts, including nights, weekends, or holidays as needed

Professional Competencies
- Critical thinking and analytical skills to troubleshoot complex issues
- Effective communication skills for liaising with technical teams and business stakeholders
- Collaboration skills to support cross-team coordination and problem-solving
- Ability to work independently, prioritize workloads, and manage multiple issues efficiently
- Adaptability and willingness to learn new tools and processes
- Focus on continuous enhancement of operational procedures and system stability
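As a hedged sketch of the report-validation automation described above, the snippet below reconciles two report extracts with pandas and flags mismatches. The file names, column names, and tolerance are all illustrative assumptions; a real Datamart check would more likely query the reporting database directly.

```python
import pandas as pd

# Illustrative extracts; in practice these might be queried from the
# reporting database rather than read from CSV files.
today = pd.read_csv("pnl_report_today.csv")        # hypothetical file
expected = pd.read_csv("pnl_report_expected.csv")  # hypothetical file

# 1) Completeness check: did we lose or gain rows?
if len(today) != len(expected):
    print(f"row count mismatch: {len(today)} vs {len(expected)}")

# 2) Value check: join on a business key and flag P&L drift beyond a
#    small tolerance (trade_id/pnl column names are assumptions).
merged = today.merge(expected, on="trade_id", suffixes=("_t", "_e"))
drift = merged[(merged["pnl_t"] - merged["pnl_e"]).abs() > 0.01]
if not drift.empty:
    print("P&L drift detected:")
    print(drift[["trade_id", "pnl_t", "pnl_e"]])
```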
Posted 2 months ago
7.0 - 12.0 years
25 - 35 Lacs
Pune
Hybrid
Responsibilities:

NetSuite Modules & Customization:
- Excellent understanding of, and hands-on experience with, NetSuite modules like Order-to-Cash, Procure-to-Pay, Record-to-Report, Inventory Management, Fixed Assets Management, Revenue Recognition, Lease Management, Billing Schedules, Advanced Procurement, Advanced Financials, Intercompany Management, Multi-Book Accounting, and Taxation.
- Monitor and administer account management, roles, user access, profile creation, and security administration.
- Develop advanced, complex financial reports and saved searches, custom records, dashboards, and analytics to provide actionable insights for executive decision-making.
- Working knowledge of CSV imports, Advanced PDF, report analytics, SuiteScripts, and testing procedures to ensure operational reliability.
- Create and optimize workflows to streamline financial processes and enhance operational efficiency.
- Develop and maintain a comprehensive architectural vision for NetSuite as the central financial system of record.
- Design scalable, future-proof NetSuite solutions that align with business objectives and best practices.
- Provide guidance on system optimization, customization, and integration strategies.
- Lead the evaluation and implementation of advanced NetSuite features and modules to enhance financial operations.
- Establish and enforce NetSuite best practices and governance policies.
- Train, inform, and support users regarding system functionality, enhancements, system configuration, and best practices.
- Communicate effectively with both technical and non-technical teams.
- Troubleshoot NetSuite functionality bugs, reporting errors, and integration malfunctions.
- Maintain up-to-date knowledge of NetSuite functionality across new releases, customizations, and integrations.
- Communicate and collaborate with multiple teams to align NetSuite strategy with business objectives.
- Support cross-functional initiatives to drive technological innovation in financial processes.
- Provide internal knowledge transfer to develop team capabilities in NetSuite.

Financial Systems Expertise:
- Partner with Finance, Accounting, and FP&A teams to understand their requirements and build the system accordingly.
- Design and implement advanced modules, reconciliation processes, and audit trails within NetSuite.
- Understand requirements for complex financial scenarios including revenue recognition, multi-entity consolidation, and advanced billing.
- Develop and optimize financial closing processes, financial reporting, and compliance frameworks.
- Ensure NetSuite configurations adhere to accounting standards, financial regulations, and audit requirements.

Automation Implementation:
- Lead the adoption of AI-powered features within NetSuite to enhance financial forecasting, anomaly detection, and decision support.
- Design and implement intelligent automation solutions for financial processes.
- Develop smart workflows that leverage AI to streamline approvals, enhance compliance, and reduce manual intervention.
- Implement solutions that utilize NetSuite's intelligent automation capabilities to improve financial accuracy and efficiency.
- Stay at the forefront of emerging AI technologies applicable to NetSuite and financial systems.

Integration & Data Architecture:
- Support enterprise-grade integrations between NetSuite and other critical business systems.
- Implement robust data models that ensure data consistency, integrity, and traceability across systems.
- Adhere to strategies for handling large volumes of financial data while maintaining system performance.
- Implement and support data migration, transformation, and validation efforts for financial system implementations.

Required Skill Sets and Qualifications:
- Bachelor's degree in computer science/engineering, business administration, finance/accounting, or equivalent practical experience.
- Minimum of 6+ years of relevant experience with NetSuite, including at least two full-cycle, end-to-end implementations.
- Strong understanding of accounting principles, financial reporting requirements, customization, and workflows.
- Experience implementing NetSuite modules for complex financial operations and supporting integrations.
- Excellent communication skills and the ability to translate complex technical concepts for non-technical stakeholders.
- Working hours are expected to be 2 pm to 11 pm IST, to align with mornings in the USA.

Preferred Qualifications:
- NetSuite certifications are a plus.
- Experience with integrations and ETL tools is preferred.
- Knowledge of AI/ML technologies and their application to financial systems.
- Experience with cloud-based integration platforms (Dell Boomi, MuleSoft).
- Experience in the energy efficiency or sustainability industry.
- Knowledge of advanced financial planning tools, to support integration with NetSuite.
- Experience with Agile development methodologies in an enterprise environment.
- Familiarity with other ERP systems and best practices for system integration.

This position offers an exceptional opportunity to shape and support the evolution of our financial systems landscape, driving innovation and excellence in a dynamic, growing organization.
Posted 2 months ago
8.0 - 11.0 years
15 - 19 Lacs
Kolkata
Work from Office
Project Role: Technology Architect. Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security. Must-have skills: Databricks Unified Data Analytics Platform. Good-to-have skills: Cloud Data Architecture. Minimum 5 year(s) of experience is required. Educational Qualification: BE or MCA.

Summary: As a Technology Architect, you will be responsible for reviewing and integrating all application requirements, including functional, security, integration, performance, quality, and operations requirements. Your typical day will involve reviewing and integrating technical architecture requirements, providing input into final decisions regarding hardware, network products, system software, and security, and utilizing the Databricks Unified Data Analytics Platform to deliver impactful data-driven solutions.

Roles & Responsibilities:
- 6 or more years of experience implementing data ingestion pipelines from multiple sources and creating end-to-end data pipelines on the Databricks platform.
- 2 or more years of experience using Python, PySpark, or Scala.
- Experience with Databricks in the cloud; experience in any of AWS, Azure, or GCP, plus ETL, data engineering, data cleansing, and insertion into a data warehouse.
- Must-have skills: Databricks, Cloud Data Architecture, Python programming, Data Engineering.

Professional Attributes: Excellent writing, communication, and presentation skills. Eagerness to learn and develop oneself on an ongoing basis. Excellent client-facing and interpersonal skills. Qualification: BE or MCA.
Posted 2 months ago
15.0 - 20.0 years
6 - 10 Lacs
Nagpur
Work from Office
Project Role: Full Stack Engineer. Project Role Description: Responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. Use development skills to deliver innovative solutions that help our clients improve the services they provide. Leverage new technologies that can be applied to solve challenging business problems with a cloud-first and agile mindset. Must-have skills: Data Modeling Techniques and Methodologies. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As a Full Stack Engineer, you will be responsible for developing and engineering the end-to-end features of a system, from user experience to backend code. A typical day involves collaborating with cross-functional teams to design, implement, and optimize innovative solutions that enhance client services. You will leverage new technologies and methodologies to address complex business challenges while maintaining a cloud-first and agile approach. Your role will require you to engage in problem-solving and decision-making processes that drive project success and improve overall system performance.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall skill levels.
- Continuously evaluate and improve development processes to increase efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of the software development life cycle and agile methodologies.
- Experience with cloud technologies and services.
- Proficiency in programming languages such as Java, JavaScript, or Python.
- Familiarity with database management systems and data architecture.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Nagpur office.
- 15 years of full-time education is required.
Posted 2 months ago
3.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Databricks Unified Data Analytics Platform. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to maintain alignment and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior team members to foster their professional growth.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 2 months ago
8.0 - 13.0 years
15 - 19 Lacs
Coimbatore
Work from Office
Project Role: Technology Architect. Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security. Must-have skills: Databricks Unified Data Analytics Platform. Good-to-have skills: Cloud Data Architecture. Minimum 7.5 year(s) of experience is required. Educational Qualification: BE or MCA.

Summary: As a Technology Architect, you will be responsible for reviewing and integrating all application requirements, including functional, security, integration, performance, quality, and operations requirements. Your typical day will involve reviewing and integrating technical architecture requirements, providing input into final decisions regarding hardware, network products, system software, and security, and utilizing the Databricks Unified Data Analytics Platform to deliver impactful data-driven solutions.

Roles & Responsibilities:
- A minimum of 8 years of experience in Databricks Unified Data Analytics Platform.
- A strong educational background in technology and information architectures, along with a proven track record of delivering impactful data-driven solutions.
- Strong requirement-analysis and technical-solutioning skills in Data and Analytics.
- A client-facing role: running solution workshops and client visits, handling large RFP pursuits, and managing multiple stakeholders.

Technical Experience:
- 10 or more years of experience implementing data ingestion pipelines from multiple sources and creating end-to-end data pipelines on the Databricks platform.
- 3 or more years of experience using Python, PySpark, or Scala.
- Experience with Databricks in the cloud; experience in any of AWS, Azure, or GCP, plus ETL, data engineering, data cleansing, and insertion into a data warehouse.
- Must-have skills: Databricks, Cloud Data Architecture, Python programming, Data Engineering.

Professional Attributes: Excellent writing, communication, and presentation skills. Eagerness to learn and develop oneself on an ongoing basis. Excellent client-facing and interpersonal skills. Qualification: BE or MCA.
Posted 2 months ago
5.0 - 10.0 years
15 - 19 Lacs
Coimbatore
Work from Office
Project Role: Technology Architect. Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security. Must-have skills: Databricks Unified Data Analytics Platform. Good-to-have skills: Cloud Data Architecture. Minimum 5 year(s) of experience is required. Educational Qualification: BE or MCA.

Summary: As a Technology Architect, you will review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. You will also review and integrate the technical architecture requirements and provide input into final decisions regarding hardware, network products, system software, and security.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the design and implementation of technology solutions.
- Develop technical specifications and architecture designs.
- Ensure compliance with architectural standards and guidelines.
- Conduct technology evaluations and provide recommendations.

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: experience with Cloud Data Architecture.
- Strong understanding of data architecture principles.
- Experience in designing and implementing data solutions.
- Knowledge of cloud-based data technologies.
- Ability to analyze complex technical requirements.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- BE or MCA is required.
Posted 2 months ago
12.0 - 15.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Ab Initio. Good-to-have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding the team in implementing effective solutions. You will also engage in strategic planning and decision-making processes, ensuring that the applications developed align with organizational objectives and meet user needs. Your role will require you to balance technical expertise with leadership skills, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have skills: proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with performance tuning and optimization of data processing workflows.
- Familiarity with data governance and data quality best practices.
- Ability to troubleshoot and resolve technical issues in a timely manner.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 2 months ago
10.0 - 15.0 years
22 - 37 Lacs
Bengaluru
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Join Kyndryl as a Data Architect, where you will unlock the power of data to drive strategic decisions and shape the future of our business. As a key member of our team, you will harness your expertise in basic statistics, business fundamentals, and communication to uncover valuable insights and transform raw data into rigorous visualizations and compelling stories.

In this role, you will have the opportunity to work closely with our customers as part of a top-notch team. You will dive deep into vast IT datasets, unraveling the mysteries hidden within, and discovering trends and patterns that will revolutionize our customers' understanding of their own landscapes. Armed with your advanced analytical skills, you will draw compelling conclusions and develop data-driven insights that will directly impact their decision-making processes.

Your Role and Responsibilities:
- Data Architecture Design: Design scalable, secure, and high-performance data architectures, including data warehouses, data lakes, and BI solutions.
- Data Modeling: Develop and maintain complex data models (ER, star, and snowflake schemas) to support BI and analytics requirements (a minimal star-schema sketch follows below).
- BI Strategy and Implementation: Lead the design and implementation of BI solutions using platforms like Power BI, Tableau, Qlik, and Looker.
- ETL/ELT Management: Architect efficient ETL/ELT pipelines for data transformation and integration across multiple data sources.
- Data Governance: Implement data quality, data lineage, and metadata management frameworks to ensure data reliability and compliance.
- Performance Optimization: Optimize data storage and retrieval processes for speed, scalability, and efficiency.
- Stakeholder Collaboration: Work closely with business and technical teams to define data requirements and deliver actionable insights.
- Cloud and Big Data: Utilize cloud-native tools like Azure Synapse, AWS Redshift, GCP BigQuery, and Databricks for large-scale data processing.
- Mentorship: Guide junior data engineers and BI developers on best practices and advanced techniques.

Your unique ability to communicate and empathize with stakeholders will be invaluable. By understanding the business objectives and success criteria of each project, you will align your data analysis efforts seamlessly with our overarching goals. With your mastery of business valuation, decision-making, project scoping, and storytelling, you will transform data into meaningful narratives that drive real-world impact.

At Kyndryl, we believe that data holds immense potential, and we are committed to helping you unlock that potential. You will have access to vast repositories of data, empowering you to delve deep to determine root causes of defects and variation. By gaining a comprehensive understanding of the data and its specific purpose, you will be at the forefront of driving innovation and making a difference. If you are ready to unleash your analytical ability, collaborate with industry experts, and shape the future of data-driven decision making, then join us as a Data Architect at Kyndryl. Together, we will harness the power of data to redefine what is possible and create a future filled with limitless possibilities.
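As a hedged illustration of the star-schema modeling named above, the snippet below declares a minimal fact table with two dimensions using Spark SQL, then runs a typical BI aggregation across them. Table and column names are invented for the example, and Delta is assumed as the storage format; neither is specified by the posting.

```python
from pyspark.sql import SparkSession

# Illustrative star schema: one fact table keyed to two dimensions.
# Names and the Delta format are assumptions for the sketch.
spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_date (
        date_key INT, calendar_date DATE, fiscal_quarter STRING
    ) USING DELTA
""")
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_product (
        product_key INT, product_name STRING, category STRING
    ) USING DELTA
""")
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        date_key INT,        -- foreign key -> dim_date
        product_key INT,     -- foreign key -> dim_product
        units_sold INT,
        revenue DECIMAL(18, 2)
    ) USING DELTA
""")

# A typical BI query: join the fact to its dimensions and aggregate.
spark.sql("""
    SELECT d.fiscal_quarter, p.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.fiscal_quarter, p.category
""").show()
```

The design choice the star shape encodes: facts are narrow and numeric so they scale to billions of rows, while descriptive attributes live in small dimension tables that BI tools can join and filter on cheaply.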
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
- Education: Bachelor's or Master's in Computer Science, Data Science, or a related field.
- Experience: 8+ years in data architecture, BI, and analytics roles.
- BI tools: Power BI, Tableau, Qlik, Looker, SAP Analytics Cloud.
- Data modeling: ER, dimensional, star, and snowflake schemas.
- Cloud platforms: Azure, AWS, GCP, Snowflake.
- Databases: SQL Server, Oracle, MySQL, NoSQL (MongoDB, DynamoDB).
- ETL tools: Informatica, Talend, SSIS, Apache NiFi.
- Scripting: Python, R, SQL, DAX, MDX.
- Soft skills: strong communication, problem-solving, and leadership abilities.
- Knowledge of deployment patterns.
- Strong documentation, troubleshooting, and data profiling skills.
- Excellent analytical, conceptual, and problem-solving abilities.
- Ability to manage multiple priorities and swiftly adapt to changing demands.

Preferred Skills and Experience
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified Data Analytics - Specialty
- Google Professional Data Engineer
- Tableau Desktop Certified Professional
- Power BI Data Analyst Associate

Being You
Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
Posted 2 months ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
The Group You'll Be A Part Of

Global Information Systems – Enterprise Analytics and Engineering

The Impact You'll Make

As a lead Data Engineer in the Enterprise Analytics group at Lam Research, you will be responsible for designing and leading the data architecture and integration for multiple projects. You will write the specs, design the solutions, and coordinate with the development team to support the business needs, collaborating with data architecture, business, and development teams. You will also guide application and business development teams in the analytics design of complex solutions and ensure that they are in alignment with the COE data principles, standards, and strategies.

What You'll Do
Design advanced analytics applications using Microsoft Fabric, PySpark, Spark SQL, Power BI, Synapse, and SQL warehouses, including ETL pipelines on the Microsoft Azure platform (a minimal pipeline sketch follows this section).
Lead, analyze, design, and deliver big-data-centric analytics solutions and applications, including statistical data models, reports, and dashboards in the Microsoft/Azure tech stack.
Use leadership abilities to drive cross-functional development on new solutions from design through delivery.
Define, maintain, and promote the advanced analytics standards and processes as part of the GIS Center of Excellence across the company.
Work with key stakeholders in product management and leadership to define strategy and requirements.
Work with US teams to deliver supporting software and services in sync with launch timelines.
Offer insight, guidance, and direction on the usage of emerging data analytics trends and technical capabilities.

Who We're Looking For
Data Engineering lead with 7+ years of experience in Databricks / PySpark.
Minimum of 8 years of related experience with a Bachelor's degree in computer science or equivalent; or 6 years and a Master's degree; or a PhD with 3 years of experience; or equivalent experience.

Preferred Qualifications
7+ years of strong experience in Databricks / PySpark
7+ years of strong T-SQL experience
5+ years of experience designing, building, and launching extremely efficient and reliable data pipelines for movement of data to and from disparate source systems
2+ years of experience working with Microsoft Azure and strong knowledge of ADLS, ADF, Databricks, SQL WH, etc.
4+ years of building and launching new data models that provide intuitive analytics for analysts and customers
Strong experience in data warehouse ETL design and development, methodologies, tools, processes, and best practices
Experience working with Power BI
Excellent query-writing and communication skills preferred
Familiarity with common APIs (REST, SOAP) preferred
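As one illustration of the Azure pipeline work described under "What You'll Do", here is a minimal PySpark sketch that ingests raw files from ADLS and lands them as a Delta table. The storage paths, column names, and partitioning choice are assumptions for illustration, not Lam specifics; a Databricks runtime with Delta Lake is assumed.

```python
# A minimal ADLS-to-Delta ingestion sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls-delta-sketch").getOrCreate()

# Hypothetical ADLS Gen2 locations (replace <storage-account> appropriately).
source_path = "abfss://raw@<storage-account>.dfs.core.windows.net/sales/"
target_path = "abfss://curated@<storage-account>.dfs.core.windows.net/sales_delta/"

# Ingest raw CSV files, apply light typing, and stamp the ingest time.
df = (
    spark.read.option("header", True).csv(source_path)
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("ingest_ts", F.current_timestamp())
)

# Land as a partitioned Delta table for downstream Synapse / Power BI use.
(df.write.format("delta")
   .mode("append")
   .partitionBy("order_date")
   .save(target_path))
```

In practice such a job would typically be scheduled by ADF or a Databricks workflow, with the same read-transform-write shape.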
Our Commitment

We believe it is important for every person to feel valued, included, and empowered to achieve their full potential. By bringing unique individuals and viewpoints together, we achieve extraordinary results. Lam Research ("Lam" or the "Company") is an equal opportunity employer. Lam is committed to and reaffirms support of equal opportunity in employment and non-discrimination in employment policies, practices and procedures on the basis of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex (including pregnancy, childbirth and related medical conditions), gender, gender identity, gender expression, age, sexual orientation, or military and veteran status, or any other category protected by applicable federal, state, or local laws. It is the Company's intention to comply with all applicable laws and regulations. Company policy prohibits unlawful discrimination against applicants or employees.

Lam offers a variety of work location models based on the needs of each role. Our hybrid roles combine the benefits of on-site collaboration with colleagues and the flexibility to work remotely, and fall into two categories – On-site Flex and Virtual Flex. On-site Flex: you'll work 3+ days per week on-site at a Lam or customer/supplier location, with the opportunity to work remotely for the balance of the week. Virtual Flex: you'll work 1-2 days per week on-site at a Lam or customer/supplier location, and remotely the rest of the time.
Posted 2 months ago
4.0 - 9.0 years
15 - 30 Lacs
Gurugram, Chennai
Work from Office
Role & responsibilities
• Assume ownership of Data Engineering projects from inception to completion.
• Implement fully operational Unified Data Platform solutions in production environments using technologies like Databricks, Snowflake, Azure Synapse, etc.
• Showcase proficiency in Data Modelling and Data Architecture.
• Utilize modern data transformation tools such as DBT (Data Build Tool) to streamline and automate data pipelines (nice to have).
• Implement DevOps practices for continuous integration and deployment (CI/CD) to ensure robust and scalable data solutions (nice to have).
• Maintain code versioning and collaborate effectively within a version-controlled environment.
• Familiarity with Data Ingestion & Orchestration tools such as Azure Data Factory, Azure Synapse, AWS Glue, etc. (a minimal orchestration sketch follows this posting).
• Set up processes for data management and templatized analytical modules/deliverables.
• Continuously improve processes with a focus on automation and partner with different teams to develop system capability.
• Proactively seek opportunities to help and mentor team members by sharing knowledge and expanding skills.
• Communicate effectively with internal and external stakeholders.
• Coordinate with cross-functional team members to ensure high quality in deliverables with no impact on timelines.

Preferred candidate profile
• Expertise in programming languages such as Python and advanced SQL.
• Working knowledge of Data Warehousing, Data Marts, and Business Intelligence, with hands-on experience implementing fully operational data warehouse solutions in production environments.
• 3+ years of working knowledge of big data tools (Hive, Spark) along with ETL tools and cloud platforms.
• 3+ years of relevant experience in either Snowflake or Databricks; certification in Snowflake or Databricks is highly recommended.
• Proficient in Data Modelling and ELT techniques.
• Experienced with any of the ETL/Data Pipeline Orchestration tools such as Azure Data Factory, AWS Glue, Azure Synapse, Airflow, etc.
• Experience ingesting data from different data sources such as RDBMS, ERP systems, APIs, etc.
• Knowledge of modern data transformation tools, particularly DBT (Data Build Tool), for streamlined and automated data pipelines (nice to have).
• Experience implementing DevOps practices for CI/CD to ensure robust and scalable data solutions (nice to have).
• Proficient in maintaining code versioning and effective collaboration within a version-controlled environment.
• Ability to work effectively as an individual contributor and in small teams; experience mentoring junior team members.
• Excellent problem-solving and troubleshooting ability, with experience supporting and working with cross-functional teams in a dynamic environment.
• Strong verbal and written communication skills, with the ability to articulate results and issues to internal and client teams.
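To make the orchestration tooling named above concrete, here is a minimal Apache Airflow sketch of a two-task extract-then-transform pipeline. The DAG id, schedule, and task callables are hypothetical placeholders; ADF or AWS Glue would express the same dependency graph in their own notations.

```python
# A minimal Airflow 2.x orchestration sketch; names and logic are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull from a source system (RDBMS, ERP, API, etc.) -- placeholder.
    print("extracting...")


def transform():
    # Apply business transformations (e.g., a DBT run or Spark job) -- placeholder.
    print("transforming...")


with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    # Transform runs only after extract succeeds.
    t_extract >> t_transform
```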
Posted 2 months ago
3.0 - 6.0 years
6 - 10 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners

KPI Partners is a leading provider of data analytics solutions, dedicated to helping organizations transform data into actionable insights. Our innovative approach combines advanced technology with expert consulting, allowing businesses to leverage their data for improved performance and decision-making.

Job Description

We are seeking a skilled and motivated Data Engineer with experience in Databricks to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and data processing solutions that support our analytics initiatives. You will collaborate closely with data scientists, analysts, and other engineers to ensure the consistent flow of high-quality data across our platforms.

Key skills: Python, PySpark, Databricks, ETL, Cloud (AWS, Azure, or GCP)

Key Responsibilities
- Develop, construct, test, and maintain data architectures (e.g., large-scale data processing systems) in Databricks.
- Design and implement ETL (Extract, Transform, Load) processes to move and transform data from various sources to target systems.
- Collaborate with data scientists and analysts to understand data requirements and design appropriate data models and structures.
- Optimize data storage and retrieval for performance and efficiency.
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Engage in data quality assessments, validation, and troubleshooting of data issues (a minimal validation sketch follows this posting).
- Stay current with emerging technologies and best practices in data engineering and analytics.

Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role, with hands-on experience in Databricks.
- Strong proficiency in SQL and programming languages such as Python or Scala.
- Experience with cloud platforms (AWS, Azure, or GCP) and related technologies.
- Familiarity with data warehousing concepts and data modeling techniques.
- Knowledge of data integration tools and ETL frameworks.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.

Why Join KPI Partners?
- Be part of a forward-thinking team that values innovation and collaboration.
- Opportunity to work on exciting projects across diverse industries.
- Continuous learning and professional development opportunities.
- Competitive salary and benefits package.
- Flexible work environment with hybrid work options.

If you are passionate about data engineering and excited about using Databricks to drive impactful insights, we would love to hear from you! KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
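The data quality responsibility above can be made concrete with a simple gate in the pipeline that fails fast before bad data reaches consumers. Below is a minimal PySpark sketch; the curated table name (`orders`), key column, and zero-tolerance thresholds are assumptions for illustration.

```python
# A minimal data-quality gate sketch; table, columns, and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check-sketch").getOrCreate()
df = spark.table("orders")  # hypothetical curated table

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

# Raise so the orchestrator marks the run failed and downstream tasks never start.
if total == 0 or null_ids > 0 or dupes > 0:
    raise ValueError(
        f"DQ check failed: rows={total}, null_ids={null_ids}, duplicates={dupes}"
    )
print("DQ check passed")
```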
Posted 2 months ago
10.0 - 15.0 years
16 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role: Tech Lead
Educational Qualification: ME / BE / MCA / MSc
Experience Required: 10+ Years
Shifts: Day shift

Responsibilities:
10+ years of expertise working in Core Java, J2EE, design patterns, Spring, Spring Boot, and microservice architecture
Expertise in web services: SOAP and REST
Experience with at least one UI technology like Angular, React JS, or OJET
Good knowledge of Postgres/MySQL and advanced PL/SQL with JSON data management
Build tools like Ant, Maven, and Gradle
Experience with Azure Repos/SVN/Git source control tools is a must
Solid understanding of the software development life cycle and OOPS concepts
Working with APIs/integrations is a must
Experience working with JDK 17+ is preferred
Excellent understanding of architectural principles involved in SaaS and multi-tenant platforms
Strong in development tools like Eclipse with SDS, IntelliJ, Git, Gradle, Sonar, Jenkins, and Jira/ADO/Artifactory
ORM technologies like Hibernate
Experience with Kubernetes and Docker is preferred
Experience with messaging systems and data pipelines such as RabbitMQ and Kafka is preferred
Experience using other API management solutions like Apigee
Well versed with scalability, automation, resiliency, high availability, and user experience
Experience in cloud computing application implementations on AWS is preferred
Strong background in creating secure cloud architectures for customer-facing applications that are enterprise-grade and highly scalable is strongly preferred
Experience in Agile and DevOps culture is a plus
Should have managed a team of SSEs/SEs and led at least 2 projects
Good communication skills
Posted 2 months ago
2.0 - 5.0 years
8 - 12 Lacs
Noida
Work from Office
MAQ LLC d.b.a. MAQ Software has multiple openings at Redmond, WA for: Software Data Operations Engineer (BS+2)

Responsible for gathering & analyzing business requirements from customers. Implement, test, and integrate software applications for use by customers. Develop & review cost-effective data architecture to ensure appropriateness with current industry advances in data management, cloud, & user experience. Automate user test scenarios, and debug & fix errors in cloud-based data infrastructure and reporting applications to meet customer needs. Must be able to travel temporarily to client sites and/or relocate throughout the United States.

Requirements: Bachelor's degree or foreign equivalent in Computer Science, Computer Applications, Computer Information Systems, Information Technology, or a related field, with two years of work experience in the job offered, as a software engineer, systems analyst, or in a related job.
Posted 2 months ago
12.0 - 15.0 years
35 - 50 Lacs
Hyderabad
Work from Office
Skill: Java, Spark, Kafka
Experience: 10 to 16 years
Location: Hyderabad

As a Data Engineer, you will:
Support the design and rollout of the data architecture and infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
Identify data sources, design and implement data schemas/models, and integrate data that meets the requirements of the business stakeholders
Play an active role in the end-to-end delivery of AI solutions, from ideation and feasibility assessment to data preparation and industrialization
Work with business, IT, and data stakeholders to support data-related technical issues and data infrastructure needs, and to build the most flexible and scalable data platform
With a strong focus on DataOps, design, develop, and deploy scalable batch and/or real-time data pipelines (a minimal streaming sketch follows this posting)
Design, document, test, and deploy ETL/ELT processes
Find the right trade-offs between the performance, reliability, scalability, and cost of the data pipelines you implement
Monitor data processing efficiency and propose solutions for improvements
Create and maintain comprehensive project documentation
Build and share knowledge with colleagues and coach junior profiles
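Although this posting centres on Java, the real-time pipeline pattern it describes is language-agnostic; a minimal PySpark Structured Streaming sketch is shown below for illustration. The broker address, topic, event schema, and output paths are all assumptions, and the Spark-Kafka connector package is assumed to be on the classpath.

```python
# A minimal Kafka-to-Delta streaming sketch; endpoints and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Hypothetical event payload schema.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read events from Kafka and parse the JSON payload.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append continuously to a Delta sink, checkpointing for exactly-once recovery.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # hypothetical path
    .outputMode("append")
    .start("/tmp/delta/events")                               # hypothetical path
)
query.awaitTermination()
```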
Posted 2 months ago