
265 Advanced SQL Jobs - Page 4


9.0 - 14.0 years

6 - 16 Lacs

Pune

Work from Office

Cilicant Private Limited is a fast-growing, innovation-led pharma packaging company working towards a vision of becoming a fully digitized and lean organization. With SAP B1 at the core of our business systems, integrated with Salesforce and a recently implemented Warehouse Management System (WMS), we are now looking to scale up our IT and automation capabilities across all business functions.

We are looking for a Techno-Functional SAP B1 & IT Lead to drive our digital transformation initiatives. The ideal candidate will lead both Business Applications (SAP B1, Salesforce, WMS, custom solutions) and IT Infrastructure & Security, while working closely with cross-functional teams to digitize and automate core business processes. This is a hands-on leadership role with team responsibility.

Key Responsibilities

1. SAP B1 & Business Application Management
- Lead implementation, customization, and optimization of SAP B1 (SQL) modules across departments.
- Design and manage integrations between SAP B1, Salesforce, WMS, barcoding systems, and in-house applications.
- Develop and maintain add-ons using the SAP B1 SDK, SQL queries, and Crystal Reports.
- Collaborate with business users to identify process gaps and recommend automation and digital solutions.

2. Application Development & Automation
- Develop in-house web and desktop applications using .NET technologies to support unique business needs.
- Identify automation opportunities across Production, Finance, Stores, Quality, Purchase, Sales, and Dispatch.
- Lead automation initiatives with external vendors or internal developers, ensuring security and scalability.

3. IT Infrastructure & Cybersecurity
- Oversee the company's IT architecture, including server setup, cloud/data storage, networks, and endpoints.
- Define and implement IT policies for access, security, disaster recovery, procurement, and asset management.
- Ensure cybersecurity by implementing tools such as firewalls, endpoint security, and user access control.

4. User Support & Training
- Lead a support structure for all business users of SAP B1, Salesforce, WMS, and in-house systems.
- Create SOPs and conduct user training sessions.
- Troubleshoot system issues and act as the point of escalation for all IT-related concerns.

5. Documentation & Compliance
- Maintain updated documentation of configurations, integrations, custom modules, and IT policies.
- Ensure compliance with industry regulations and internal data protection policies.
- Stay informed of SAP B1 updates, best practices, and industry trends.

Candidate Profile

Technical Skills:
- Proficiency in the SAP B1 SDK, SQL (advanced), and .NET (C# / VB.NET) for application development.
- Experience integrating SAP B1 with third-party systems such as Salesforce and WMS.
- Experience with Crystal Reports, stored procedures, and database optimization.
- Good understanding of IT infrastructure, networking, and cybersecurity protocols.

Functional Knowledge:
- Strong understanding of manufacturing business processes: Sales, Purchase, Inventory, Production, Quality, Finance, Dispatch.
- Experience in process mapping and automation for lean operations.
- Ability to translate business needs into technical solutions.

Behavioral & Leadership Competencies:
- Self-driven with a problem-solving mindset.
- Ability to manage cross-functional teams and external vendors.
- Strong project management and execution capability.
- Excellent communication skills to collaborate with business and tech stakeholders.

Posted 1 month ago

15.0 - 20.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Data Analytics
Good to have skills: Microsoft SQL Server, Python (Programming Language), AWS Redshift
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary
The purpose of the Data Engineering function within the Data and Analytics team is to develop and deliver great data assets and data domain management for our Personal Banking customers and colleagues, seamlessly and reliably every time. As a Senior Data Engineer, you will bring expertise in data handling, curation and conformity capabilities to the team; support the design and development of solutions which assist analysis of data to drive tangible business benefit; and assist colleagues in developing solutions that will enable the capture and curation of data for analysis, analytical and/or reporting purposes. The Senior Data Engineer must have experience working as part of an agile team to develop a solution in a complex enterprise.

Roles & Responsibilities
- Hands-on development experience in Data Warehousing and/or Software Development
- Experience utilising tools and practices to build, verify and deploy solutions in the most efficient ways
- Experience in Data Integration and Data Sourcing activities
- Experience developing data assets to support optimised analysis for customer and regulatory outcomes
- Provide ongoing support for platforms as required, e.g. problem and incident management
- Experience in Agile software development, including GitHub, Confluence, Rally

Professional & Technical Skills
- Experience with cloud technologies, especially AWS (S3, Redshift, Airflow), and DevOps/DataOps tools (Jenkins, Git, Erwin)
- Advanced SQL and Python user
- Knowledge of UNIX, Spark and Databricks

Additional Information
Position: Senior Analyst, Data Engineering
Reports to: Manager, Data Engineering
Division: Personal Bank
Group: 3
Industry/domain skills: Some expertise in Retail Banking, Business Banking and/or Wealth Management preferred
Qualification: 15 years full time education

Posted 1 month ago

6.0 - 11.0 years

8 - 12 Lacs

Pune

Work from Office

What You'll Do
We are seeking an experienced Lead Data Engineer with a strong background in ETL processes, data warehousing, data modeling, and hands-on expertise in SQL and Python. The ideal candidate will have exposure to cloud technologies and will play a key role in designing and managing scalable, high-performance data systems that support marketing and sales insights. You will report to the Manager, Data Engineering.

What Your Responsibilities Will Be
- Design, develop, and maintain efficient ETL pipelines using DBT and Airflow to move and transform data from multiple sources into a data warehouse.
- Lead the development and optimization of data models (e.g., star and snowflake schemas) and data structures to support reporting.
- Leverage cloud platforms (e.g., AWS, Azure, Google Cloud) to manage and scale data storage, processing, and transformation processes.
- Work with business teams, marketing, and sales departments to understand data requirements and translate them into actionable insights and efficient data structures.
- Use advanced SQL and Python skills to query, manipulate, and transform data for multiple use cases and reporting needs.
- Implement data quality checks and ensure that the data adheres to governance best practices, maintaining consistency and integrity across datasets.
- Use Git for version control and collaborate on data engineering projects.

What You'll Need to Be Successful
- Bachelor's degree with 6+ years of experience in Data Engineering.
- ETL/ELT expertise: experience building and improving ETL/ELT processes.
- Data modeling: experience designing and implementing data models such as star and snowflake schemas, and working with denormalized tables to optimize reporting performance.
- Experience with cloud-based data platforms (AWS, Azure, Google Cloud).
- SQL and Python proficiency: advanced SQL skills for querying large datasets and Python for automation, data processing, and integration tasks.
- DBT experience: hands-on experience with DBT (Data Build Tool) for transforming and managing data models.

Good to Have Skills:
- Familiarity with AI concepts such as machine learning (ML), natural language processing (NLP), and generative AI; working with AI-driven tools and models for data analysis, reporting, and automation.
- Experience overseeing and implementing DBT models to improve the data transformation process.
- Experience in the marketing and sales domain, with lead management, marketing analytics, and sales data integration.
- Familiarity with business intelligence reporting tools such as Power BI for building data models and generating insights.
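The star and snowflake schemas this posting asks for can be illustrated with a minimal sketch. All table and column names below are invented for illustration; SQLite stands in for the warehouse so the example runs anywhere:

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
# Names are hypothetical, not taken from any employer's system.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_region  (region_id  INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    product_id INTEGER REFERENCES dim_product(product_id),
    region_id  INTEGER REFERENCES dim_region(region_id),
    amount     REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_region  VALUES (1, 'North'), (2, 'South');
INSERT INTO fact_sales  VALUES (1, 1, 100.0), (1, 2, 50.0), (2, 1, 75.0);
""")

# Typical reporting query: join the central fact table to its
# dimensions and aggregate the measure.
rows = conn.execute("""
    SELECT p.name, r.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_region  r ON r.region_id  = f.region_id
    GROUP BY p.name, r.name
    ORDER BY p.name, r.name
""").fetchall()
print(rows)
```

A snowflake schema differs only in that the dimensions themselves are further normalized (e.g. `dim_region` split into region and country tables); the fact table and join pattern stay the same.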

Posted 1 month ago

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project Description
An excellent opportunity for personal development in a dynamic environment. You will join a highly skilled and dynamic team supporting Murex applications in the UK and our global practice focused on application installation support around the world. We are one of the largest Murex partners and offer a wide range of opportunities in the region, with good opportunities to develop in different areas. The team is highly skilled and will provide a great opportunity to expand your knowledge.

Responsibilities
- Act as the subject matter expert for datamart and integration, ensuring that all functionality of the product is installed and leveraged to its best capability
- Technical analysis of changes, solution design, development/configuration and unit testing of MxML workflows and datamart
- Analysis and documentation of user requirements, transposed into functional specifications
- Define the systems and data requirements and validate the systems design and processes from functional and technical aspects
- End-to-end ownership of tasks in cooperation with Business Analysts and the Testing team
- Contribute to user training activities through one-to-one discussion and preparation of user training guides and presentations
- Follow up with vendor support as necessary to resolve bugs/issues
- Ensure technical and functional handover of the project and changes to the relevant teams
- Participate in fixing production and test defects

Skills

Must have:
- 4+ years of Murex development experience
- Experience working in the financial industry, with relevant experience in business analysis and project implementation
- Experience in managing and delivering trading platforms for Treasury products on a global scale, integrated within the organization's treasury product systems
- Strong team player with excellent communication and interpersonal skills
- Strong problem solver who can question and understand proposed solutions and business drivers
- Strong organizational and leadership skills
- Strong understanding of treasury products and experience in back-office projects
- Good knowledge of the different post-trade interactions between the various actors of capital markets, including service providers
- Advanced MxML workflow and formulae development
- Strong datamart knowledge
- Advanced SQL
- Good general financial market understanding
- Knowledge of the pre-trade framework along with the MSL scripting language
- Unix

Nice to have:
- Experience in other Murex modules

Locations: Pune, Bangalore, Hyderabad, Chennai, Noida

Posted 1 month ago

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

As a Data Scientist at IBM, you will uncover and transform insights into creative experiences and business value that matter to our clients. You will analyze data, apply your business knowledge to analyze client business issues, formulate hypotheses and test conclusions to determine appropriate solutions, communicate outcomes, and collaborate on solution development.

Our Marketing, Communications & Corporate Social Responsibility (MCC) team is responsible for positioning IBM in the market. We define and optimize IBM's brand, capture the market's attention, and articulate our point of view for clients, partners, the media, and even other IBMers. As part of our team, you'll be surrounded by bright minds and keen collaborators - always willing to help and be helped - as you apply passion to work that will compel our audience to choose IBM's products and services.

As a Marketing Data Scientist & AI Professional, you'll work collaboratively, as part of a team, on a project that addresses a strategic IBM Marketing business challenge. This role supports our Performance Intelligence team, which is responsible for building intelligence that powers and orchestrates performance across tactics and buyer groups.

As a member of the team, you may:
- Develop scalable analytical solutions that provide data-driven and optimization insights
- Work with large, complex data sets and extract knowledge or insights to solve difficult, non-routine analysis problems, applying advanced analytical methods as needed
- Conduct end-to-end analysis that includes industry research, data gathering and requirements specification, processing, analysis, ongoing deliverables, and presentations
- Communicate informed conclusions and recommendations across the organization's leadership structure

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 3+ years of relevant data science work experience in complex data querying environments, working with complex data models using advanced SQL/Python or other query and programming tools to process and analyze data.
- Advanced knowledge and experience working with large data sets and applying data mining / predictive modeling techniques to extract meaningful insights.
- Thought leadership in working on functional objectives and shaping a solution.
- Ability to translate business requirements into technical solutions.
- Familiarity with microservices architecture, DevOps, deployment processes, and cloud platforms such as AWS, Azure, IBM Cloud, or Google Cloud.

Preferred technical and professional experience
- Graduate degree in a quantitative discipline (e.g., statistics, data science, computer science, behavioral science, applied mathematics, operations research) or another discipline involving experimental design and quantitative analysis of data is a plus.
- Experience with statistical analysis such as linear models, multivariate analysis, clustering, time series, mixed models, and Bayesian methods.
- Relevant work experience in marketing analytics and web analytics is a plus.

Posted 1 month ago

7.0 - 8.0 years

9 - 10 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)

What do you need for this opportunity?
Must-have skills: Gen AI, AWS data stack, Kinesis, open table format, PySpark, stream processing, Kafka, MySQL, Python

MatchMove is looking for a Technical Lead - Data Platform. You will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:
- Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
- Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
- Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
- Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
- Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
- Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
- Using generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
- Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities:
- Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
- Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
- Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
- Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards).
- Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
- Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
- Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements
- At least 7 years of experience in data engineering.
- Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
- Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
- Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
- Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
- Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
- Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
- Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains with strong engineering hygiene.
- Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie Points:
- Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
- Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
- Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
- Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
- Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
- Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement Model: Direct placement with client
This is a remote role.
Shift timings: 10 AM to 7 PM
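The schema drift handling mentioned for DMS-based ingestion can be sketched in plain Python, independent of any AWS service: when upstream tables gain or lose columns over time, normalize each record to the union of observed fields so downstream consumers see a stable shape. Field names here are hypothetical:

```python
# Minimal schema-drift sketch: incoming batches may gain or lose columns
# over time; normalize every record to the union of observed fields,
# filling gaps with None. Purely illustrative, not any platform's code.
def normalize(batches):
    # First pass: collect the union of all field names, preserving
    # first-seen order.
    fields = []
    for batch in batches:
        for rec in batch:
            for k in rec:
                if k not in fields:
                    fields.append(k)
    # Second pass: emit records with a uniform schema.
    return fields, [
        {k: rec.get(k) for k in fields}
        for batch in batches for rec in batch
    ]

old = [{"txn_id": 1, "amount": 100}]
new = [{"txn_id": 2, "amount": 50, "currency": "SGD"}]  # column added upstream
fields, rows = normalize([old, new])
print(fields)
print(rows[0])
```

Table formats like Iceberg and Delta Lake apply the same idea at the metadata layer: added columns are recorded in the table schema and read as null for older files, rather than rewriting historical data.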

Posted 1 month ago

7.0 - 12.0 years

30 - 45 Lacs

Gurugram

Hybrid

Experience: 7 to 15 years
Location: Gurgaon only
Notice: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools, etc.

Role and Responsibilities:
- Primary day-to-day client contact, often interacting with C-level executives
- Ensures client needs are well understood, defined and met
- Serves as the primary interface between senior client management and senior leadership (VPs and SVPs)
- Interacts regularly with clients to understand business requirements, define analytical problems, structure and communicate solutions, and ensure client satisfaction with strong business-driving results
- Manages/leads 1-2 engagements/projects with different team structures and service delivery models; responsible for driving revenue generated from these accounts
- Leads project teams of 3-6 consultants or more, and/or Team Leads and Analysts, in all aspects of project execution
- Plays a critical role in defining the problem, structuring the solution, and executing against it
- Clearly defines project deliverables, timelines and methodology, laying out the project plan
- Owns the execution of the project, with on-time delivery every time, ensuring all project goals are met
- Manages team members, including definition of objectives, oversight of execution and evaluation of performance
- Provides thought leadership and delivers business insights to identify and resolve complex issues critical to clients' success

Candidate Profile:
- 7+ years of experience comprising analytics service delivery, consulting, solution design and client management
- Experience and strong knowledge of the analytics service industry with a focus on B2C and omni-channel retail across the customer lifecycle and customer journey
- Demonstrable leadership ability, superior problem-solving and people management skills
- Excellent listening, written communication and presentation skills
- Experience/exposure to marketing, operations, sales, merchandizing and supply chain, retail strategy, project management, cost reduction, and business development for Retail and Media organizations
- Master's degree in management, data science, economics, mathematics, operations research or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also welcome to apply

Posted 1 month ago

2.0 - 7.0 years

15 - 30 Lacs

Bengaluru

Work from Office

About the Team
As Business Analysts, it's on us to dive into data and derive insights from it. These then become actionable solutions in the form of changes, improvements, upgrades and new features. As a Business Analyst at Meesho, you will play a crucial role in identifying, improving, and developing technology solutions that drive our strategic goals. This is a tremendous opportunity to learn about high-priority initiatives and collaborate with colleagues throughout the firm and across teams. We work at the intersection of business and technology, continuously developing our leadership, management and communication skills in the process. The exact team you will be working with will be decided during or after the hiring process. Regardless, you are sure to learn and grow, and have fun doing so too. Each of our teams at Meesho has its own fun rituals, from casual catch-ups to bar hopping, movie nights, and games.

About the Role
As a Senior Business Analyst, you will work on improving the reporting tools, methods, and processes of the team you are assigned to. You will also create and deliver weekly, monthly, and quarterly metrics critical for tracking and managing the business. You will manage numerous requests concurrently and strategically, prioritising them when necessary. You will actively engage with internal partners throughout the organisation to meet and exceed customer service levels and transport-related KPIs. You will brainstorm simple, scalable solutions to difficult problems and seamlessly manage the projects under your purview. You will maintain excellent relationships with our users and, in fact, advocate for them while keeping in mind the business goals of your team.

What you will do
- Create algorithms for optimizing demand and supply data
- Conduct analysis and solution-building based on insights captured from data
- Give insights to management and help in strategic planning
- Analyze metrics, key indicators and other available data sources to discover root causes of process defects
- Support business development and help create efficient designs and solution processes
- Determine efficient utilization of resources
- Research and implement cost reduction opportunities

Must-have skills
- MBA in any discipline
- 2+ years of experience as a Business Analyst
- Proficiency in Advanced Excel, Advanced SQL (must-have) and Python (must-have)
- Understanding of basic statistics and probability concepts
- Proven problem-solving skills
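The weekly/monthly metrics work described above typically leans on exactly the kind of Advanced SQL this posting asks for, such as window functions. A minimal sketch using SQLite, with an invented table and numbers, computes week-over-week change via `LAG`:

```python
import sqlite3

# Week-over-week change via a LAG window function.
# Table name and figures are invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE weekly_orders (week INTEGER, orders INTEGER);
INSERT INTO weekly_orders VALUES (1, 100), (2, 120), (3, 90);
""")

# LAG pulls the previous row's value within the ordered window,
# so the first week's change is NULL (no prior week to compare).
rows = conn.execute("""
    SELECT week,
           orders,
           orders - LAG(orders) OVER (ORDER BY week) AS wow_change
    FROM weekly_orders
    ORDER BY week
""").fetchall()
print(rows)  # [(1, 100, None), (2, 120, 20), (3, 90, -30)]
```

The same pattern extends to monthly and quarterly cuts by changing the grouping column and window ordering.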

Posted 1 month ago

3.0 - 8.0 years

55 - 60 Lacs

Bengaluru

Work from Office

Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment, and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them. Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself. Job Summary: At Empower, a Sr. Architect is a mix of leadership position and thought leadership role. A Sr. Architect works with enterprise architects, and both business and IT teams, to align solutions to the technology vision they help create. This role supports enterprise Architects in the development of technology strategies, reference architectures, solutions, best practices and guidance across the entire IT development organization; all the while addressing total cost of ownership, stability, performance and efficiency. The candidate will also be working with Empower Innovation Lab team as the team is experimenting with emerging technologies, such as Generative AI, and Advanced Analytics. In this rapid paced environment, the person must possess a "can-do" attitude while demonstrating a strong work ethic. This person should have a strong aptitude to help drive decisions. He or she will be actively involved in influencing the strategic direction of technology at Empower Retirement. There will be collaboration across all teams including IT Infrastructure, PMO office, Business, and third-party integrators in reviewing, evaluating, designing and implementing solutions. The Architect must understand available technology options and educate and influence technology teams to leverage them where appropriate. 
The Architect will recognize and propose alternatives, make recommendations, and describe any necessary trade-offs. In some cases, particularly on key initiatives, the Architect will participate on the design and implementation of end-to-end solutions directly with development teams. The ideal candidate will leverage their technical leadership/direction-setting skills with the development organization to be able to prove technical concepts quickly using a variety of tools, methods, & frameworks. Responsibilities: Help Enterprise Architect, work with peer Sr. Architects and more junior resources to define and execute on the business aligned IT strategy and vision. Develop, document, and provide input into the technology roadmap for Empower. Create reference architectures that demonstrate an understanding of technology components and the relationships between them. Design and modernize complex systems into cloud compatible or cloud native applications where applicable. Create strategies and designs for migrating applications to cloud systems. Participate in the evaluation of new applications, technical options, challenge the status quo, create solid business cases, and influence direction while establishing partnerships with key constituencies. Implement best practices, standards & guidance, then subsequently provide coaching of technology team members. Make leadership recommendations regarding strategic architectural considerations related to process design and process orchestration. Provide strong leadership and direction in development/engineering practices. Collaborate with other business and technology teams on architecture and design issues. Respond to evolving and changing security conditions. Implement and recommend security guidelines. Provide thought-leadership, advocacy, articulation, assurance, and maintenance of the enterprise architecture discipline. Provide solution, guidance, and implementation assistance within full stack development teams. 
Recommend long term scalable and performant architecture changes keeping cost in control. Preferred Qualifications: 12+ years of experience in the development and delivery of data systems. This experience should be relevant to roles such as Data Analyst, ETL (Extract, Transform and Load) Developer (Data Engineer), Database Administrator (DBA), Business Intelligence Developer (BI Engineer), Machine Learning Developer (ML Engineer), Data Scientist, Data Architect, Data Governance Analyst, or a managerial position overseeing any of these functions. 3+ years of experience creating solution architectures and strategies across multiple architecture domains (business, application, data, integration, infrastructure and security). Solid experience with the following technology disciplines: Python, Cloud architectures, AWS (Amazon Web Services), Bigdata (300+TBs), Advanced Analytics, Advance SQL Skills, Data Warehouse systems(Redshift or Snowflake), Advanced Programming, NoSQL, Distributed Computing, Real-time streaming Nice to have experience in Java, Kubernetes, Argo, Aurora, Google Analytics, META Analytics, Integration with 3rd party APIs, SOA & microservices design, modern integration methods (API gateway/web services, messaging & RESTful architectures). Familiarity with BI tools such as Tableau/QuickSight. Experience with code coverage tools. Working knowledge of addressing architectural cross cutting concerns and their tradeoffs, including topics such as caching, monitoring, operational surround, high availability, security, etc. Demonstrates competency applying architecture frameworks and development methods. Understanding of business process analysis and business process management (BPM). Excellent written and verbal communication skills. Experience in mentoring junior team members through code reviews and recommend adherence to best practices. Experience working with global, distributed teams. Interacts with people constantly, demonstrating strong people skills. 
Able to motivate and inspire, influencing and evangelizing a set of ideals within the enterprise. Requires a high degree of independence, proactively achieving objectives without direct supervision. Negotiates effectively at the decision-making table to accomplish goals. Evaluates and solves complex and unique problems with strong problem-solving skills. Thinks broadly, avoiding tunnel vision and considering problems from multiple angles. Possesses a general understanding of the wealth management industry, comprehending how technology impacts the business. Stays on top of the latest technologies and trends through continuous learning, including reading, training, and networking with industry colleagues. Data Architecture - Proficiency in platform design and data architecture, ensuring scalable, efficient, and secure data systems that support business objectives. Data Modeling - Expertise in designing data models that accurately represent business processes and facilitate efficient data retrieval and analysis. Cost Management - Ability to manage costs associated with data storage and processing, optimizing resource usage, and ensuring budget adherence. Disaster Recovery Planning - Planning for data disaster recovery to ensure business continuity and data integrity in case of unexpected events. SQL Optimization/Performance Improvements - Advanced skills in optimizing SQL queries for performance, reducing query execution time, and improving overall system efficiency. CICD - Knowledge of continuous integration and continuous deployment processes, ensuring rapid and reliable delivery of data solutions. Data Encryption - Implementing data encryption techniques to protect sensitive information and ensure data privacy and security. Data Obfuscation/Masking - Techniques for data obfuscation and masking to protect sensitive data while maintaining its usability for testing and analysis. 
Reporting - Experience with static and dynamic reporting to provide comprehensive and up-to-date information to business users. Dashboards and Visualizations - Creating dashboards and visualizations to present data in an intuitive and accessible manner, facilitating data-driven insights. Generative AI / Machine Learning - Understanding of generative artificial intelligence and machine learning to develop advanced predictive models and automate decision-making processes. Understanding of machine learning algorithms, deep learning frameworks, and AI model architectures. Understanding of ethical AI principles and practices. Experience implementing AI transparency and explainability techniques. Knowledge of popular RAG frameworks and tools (e.g., LangChain, LlamaIndex). Familiarity with fairness metrics and techniques to mitigate bias in AI models. Sample technologies: Cloud Platforms – AWS (preferred), Azure, or Google Cloud. Databases – Oracle, Postgres, MySQL (preferred), RDS, DynamoDB (preferred), Snowflake or Redshift (preferred). Data Engineering (ETL, ELT) – Informatica, Talend, Glue, Python (must), Jupyter. Streaming – Kafka or Kinesis. CI/CD Pipeline – Jenkins, GitHub, GitLab, or ArgoCD. Business Intelligence – QuickSight (preferred), Tableau (preferred), Business Objects, MicroStrategy, Qlik, Power BI, Looker. Advanced Analytics – AWS SageMaker (preferred), TensorFlow, PyTorch, R, scikit-learn. Monitoring tools – DataDog (preferred), AppDynamics, or Splunk. Big Data technologies – Apache Spark (must), EMR (preferred). Container Management technologies – Kubernetes, EKS (preferred), Docker, Helm. Preferred Certifications: AWS Solutions Architect, AWS Data Engineer, AWS Machine Learning Engineer. EDUCATION: Bachelor's and/or master's degree in computer science or a related field (information systems, mathematics, software engineering). We are an equal opportunity employer with a commitment to diversity.
All individuals, regardless of personal characteristics, are encouraged to apply. All qualified applicants will receive consideration for employment without regard to age, race, color, national origin, ancestry, sex, sexual orientation, gender, gender identity, gender expression, marital status, pregnancy, religion, physical or mental disability, military or veteran status, genetic information, or any other status protected by applicable state or local law.

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Bengaluru

Work from Office

Position Summary: The Data Science team is building the data platform that will enable us to create products and experiences that solve complex and critical problems for our customers. You will manage a team and will be part of the Data Science team working across cross-functional groups, supporting strategic business decisions with insight into OUR lines of service, customers and products. The role will include playing an active role in facilitating data usability, data standardisation and user acceptance testing initiatives; development and deployment of standard and non-standard metrics; and creating analytical solutions. You will leverage your technical skills, business acumen, and creativity to extract and analyse massive data sets, build analytics-ready datasets to surface insights and key business metrics, contribute to metadata that improves data usability, and much more. The ideal candidate is deeply analytical and detail-oriented, yet capable of thinking independently, and is people-oriented.
Responsibilities: Possesses strong analytical/logical thinking and communication skills. Collaborate with the data warehousing team, ensuring that data infrastructure supports the needs of OUR analytics team and validating data quality. Coordinate with business planners and decision makers to translate business questions into verifiable hypotheses and data models. Develop clear, concise, actionable models and recommendations from mountains of data. Advocate for exploration of interesting data anomalies or patterns that may provide more explanatory detail about customer behaviours or predictive value to the business. Partner closely with business and technical teams to understand their project objectives and provide data-driven solutions and recommendations. Design and develop data preparation components and processes that extract and transform data across disparate databases for reporting and analytics, ensuring integrity of analysis data by developing the requirement specifications and assisting the development and testing of data tables. Ensure solutions are scalable, repeatable, efficient and effective. Work hands-on on various analytics problems and provide thought leadership on the problems that we are working on. Interact with the onsite team as well as the client on a daily/weekly basis to gather requirements and provide updates. Be involved in the development of the company through pre-sales/operational support. Qualifications: 6-8 years of database experience with advanced SQL skills; experience researching and manipulating complex and large data sets (both distributed and non-distributed); should have experience leading a team. Proficient in one or more scripting languages, such as Python or Scala. Experience working with complex analytical tools, such as SAS or R. Experience with data visualisation tools, such as Tableau/Spotfire/QlikView. Experience in statistical techniques such as Regression, Clustering & Time Series Forecasting, etc.
Proven ability to dig in and understand the data, and to leverage creative thinking and problem-solving skills to create new data models. Strong understanding of data infrastructure, data warehousing, and data engineering. Coordinate with business planners and decision makers to translate business questions into verifiable data models and hypotheses. Work with engineers to develop, test, and maintain the accurate tracking, capturing and reporting of key data. Proficient in MS Excel and PowerPoint. Bachelor's in Engineering or Master's in Statistics/Economics.
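The statistical techniques named above (regression, clustering, time-series forecasting) are normally done with SAS, R, or scikit-learn, as the posting implies; as a minimal illustration of the simplest of them, here is a pure-Python ordinary least-squares fit of a straight line. The data points are made up for the demo.

```python
# Minimal ordinary least-squares fit for y = a + b*x (illustrative only;
# real analytics work would use statsmodels or scikit-learn).
def fit_line(xs, ys):
    """Return (intercept, slope) minimizing squared error for y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form solution: slope = cov(x, y) / var(x).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Noise-free demo data on the line y = 2 + 3x recovers the coefficients exactly.
a, b = fit_line([0, 1, 2, 3, 4], [2, 5, 8, 11, 14])
```

With real, noisy data the same formula yields the best-fit line rather than an exact recovery, and diagnostics (R², residual plots) become the interesting part.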

Posted 1 month ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. Industry: Employment Firms/Recruitment Services Firms. Experience: 4 to 7 years. Role: IDMC CAI Developer. Job Location: Remote. Job Type: FTE. JD: We are seeking a highly skilled and client-oriented Senior IICS Developer with strong hands-on experience in Informatica Intelligent Cloud Services (IDMC CDI), Advanced SQL, and ideally Power BI. The ideal candidate will have prior experience working with US-based clients, possess strong production support exposure, and demonstrate excellent presentation and client interaction skills. Candidates with a background in Big 5 consulting environments are highly preferred. Key Responsibilities: Design, develop, and implement robust data integration solutions using Informatica IICS CDI. Write and optimize complex SQL queries for data extraction, transformation, and analysis. Work closely with business stakeholders and technical teams to gather requirements and deliver client-focused solutions. Provide production support for data integration pipelines and address incidents promptly. Collaborate with teams to ensure adherence to best practices, performance optimization, and code quality. (Nice to have) Develop insightful dashboards and visualizations using Power BI. Interact directly with US-based clients; participate in meetings and ensure professional, timely communication. Document solutions and provide knowledge transfer as needed. Requirements: 4+ years of experience in ETL/Data Integration, with at least 3+ years on Informatica IICS (IDMC CDI). Strong proficiency in Advanced SQL and relational database design. Experience in production support environments and troubleshooting data integration jobs. Prior experience working with US clients and navigating stakeholder interactions effectively. Excellent communication and presentation skills. Highly client-focused and adaptable in dynamic environments. Big 5 consulting experience (e.g., Deloitte, PwC, EY, KPMG, Accenture) is a strong plus.
(Preferred) Hands-on experience with Power BI or other data visualization tools. Title: IDMC CDI Developer. Ref: 6566238

Posted 2 months ago

Apply

6.0 - 8.0 years

3 - 7 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. Industry: Employment Firms/Recruitment Services Firms. Experience: 6 to 8 years. Job Title: Data Engineer. Job Location: Remote. Job Type: Full time. Apptad is looking for a Data Engineer profile. It's a full-time/long-term job opportunity with us. The candidate should have advanced Python, advanced SQL, and PySpark skills. Python Programming Language: Level: Advanced. Key concepts: multi-threading, multi-processing, regular expressions, exception handling, etc. Libraries: Pandas, NumPy, etc. Data Modelling and Data Transformation: Level: Advanced. Key areas: data processing on structured and unstructured data. Relational Databases: Level: Advanced. Key areas: query optimization, query building, experience with ORMs like SQLAlchemy, exposure to databases such as MSSQL, Postgres, Oracle, etc. Functional and Object-Oriented Programming (OOP): Level: Intermediate. Problem Solving for Feature Development: Level: Intermediate. Good experience working with AWS Cloud and its services related to data engineering, such as Athena, AWS Batch jobs, etc. Title: Data Engineer. Ref: 6566581
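Several of the advanced Python concepts this posting names (multi-threading, regular expressions, exception handling) can be sketched in a few lines. The log format, field names, and pattern below are hypothetical, purely for illustration.

```python
import re
from concurrent.futures import ThreadPoolExecutor

# Hypothetical log lines; the pattern and field names are illustrative only.
LOG_LINES = [
    "2024-01-01 INFO user=alice latency=120ms",
    "2024-01-01 WARN user=bob latency=340ms",
    "2024-01-01 INFO user=carol latency=95ms",
]
PATTERN = re.compile(r"user=(\w+) latency=(\d+)ms")

def parse(line):
    """Extract (user, latency_ms) from one line; raise on malformed input."""
    m = PATTERN.search(line)
    if m is None:
        raise ValueError(f"unparseable line: {line!r}")
    return m.group(1), int(m.group(2))

# Threads help when the per-item work is I/O-bound (e.g. fetching each line
# from storage); for CPU-bound parsing at scale, multiprocessing fits better.
with ThreadPoolExecutor(max_workers=4) as pool:
    records = list(pool.map(parse, LOG_LINES))
```

`pool.map` preserves input order and re-raises any `ValueError` from a worker in the caller, which keeps exception handling in one place.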

Posted 2 months ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Role & responsibilities: Prepare technical reports by collecting, analyzing and summarizing information and trends. Create and maintain dashboards with operational and business metrics. Prepare business review presentations - perform daily, weekly and monthly reviews of current processes. Support the continuous optimization of operating models and reporting practices. Improve operational systems by analyzing current practices, including designing and writing potential modifications and supporting their maintenance. Conduct ad hoc analysis to investigate ongoing or one-time operational issues and communicate findings effectively to all relevant stakeholders. Preferred candidate profile: Minimum of 2 years of experience as a Data Analyst/MIS Analyst or in similar roles. Able to adapt quickly to changes in workflows; a team player who can also work independently. Excellent organizational skills and a detail-oriented approach to problem solving. Excellent English communication (verbal and written), including the ability to clearly communicate in a dynamic environment across all levels. Able to prioritize and manage tasks efficiently. Understanding of business processes and improvement methods. Understanding of departmental policies and procedures. Strong fundamental knowledge of calculations and operations metrics (SLA metrics such as Productivity, Quality, etc.). Knowledge of: Google Sheets/MS Excel (PivotTables, charts, statistical functions and macro functions), Advanced SQL, PLX, Google Apps Script or JavaScript.

Posted 2 months ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Hyderabad, Gurugram, Chennai

Hybrid

Salary: 30 to 45 LPA. Experience: 8 to 11 years. Location: Bangalore/Gurgaon. Notice: Immediate joiners only. Key Skills: SQL, Advanced SQL, BI tools, ETL, etc. Roles and Responsibilities: Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools. Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior. Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions. Desired Candidate Profile: 6-10 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc. Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred. Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.
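The advanced SQL this analytics role calls for typically means window functions, a staple of KPI and trend dashboards. A minimal sketch via Python's built-in sqlite3 (assuming the bundled SQLite is 3.25+, which added window functions); the branch/revenue schema is invented for the demo.

```python
import sqlite3

# Hypothetical monthly revenue per branch; table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (branch TEXT, month INT, amount REAL)")
conn.executemany(
    "INSERT INTO revenue VALUES (?, ?, ?)",
    [("north", 1, 100), ("north", 2, 150), ("south", 1, 80), ("south", 2, 60)],
)

# A window function computes a running total per branch without collapsing
# rows the way GROUP BY would - a common building block for trend dashboards.
rows = conn.execute(
    """
    SELECT branch, month, amount,
           SUM(amount) OVER (PARTITION BY branch ORDER BY month) AS running_total
    FROM revenue
    ORDER BY branch, month
    """
).fetchall()
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern works in Hive, Redshift, and most warehouse engines.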

Posted 2 months ago

Apply

5.0 - 9.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Type: Contract with Client. Experience: 7+ years. Must Have: Data Migration plus KingswaySoft/Scribe/writing SSRS packages. JD: Well versed and hands-on with data migration along with ETL processes. Hands-on, excellent expertise in the usage of KingswaySoft/Scribe/writing SSRS packages. Hands-on expertise in PL/SQL and advanced SQL. Hands-on expertise in DB operations. Dynamics 365 knowledge/Salesforce CRM knowledge.

Posted 2 months ago

Apply

4.0 - 6.0 years

11 - 21 Lacs

Bengaluru

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Assistant Manager, Data Analytics – Senior SME! In this role, you will be focusing on fraud detection, AML/CTF, and transaction monitoring using SQL, Python, and BI tools to develop analytical solutions and enhance risk oversight. Strong stakeholder engagement and problem-solving skills are key. Responsibilities: Support the managers and business leads to ensure that the respective TM/CRA/WLM/AEoI programs are working as intended and have appropriate oversight. Use advanced SQL/Python techniques to define analytical products which meet project needs and interpret business rules into code. Utilise analytics techniques in SQL & Python to model, design, and implement new transaction monitoring scenarios. Deliver robust documentation, code and processes using Confluence, GitLab, and SharePoint to ensure a clear audit trail of decisions, implementation and lineage of data products. Qualifications we seek in you! Minimum Qualifications / Skills: Technical Skills: Intermediate SQL proficiency for data extraction, modeling, and analytics. Beginner Python skills for data analysis, scripting, and automation. Experience working with relational databases to manage and manipulate large datasets. Expertise in Business Intelligence & Data Visualization using tools like Power BI, Tableau, or Qlik Sense.
Strong data quality management capabilities and the ability to spot trends, quality issues and anomalies in new data sources and identify ways to work around these issues. Soft Skills & Work Experience: Ability to translate business requirements into analytical solutions, working closely with both technical and non-technical stakeholders. Strong problem-solving mindset to detect anomalies, identify patterns, and enhance risk coverage. Ability to work under pressure and meet deadlines, including the ability to multi-task, prioritise and balance competing demands and expectations. Simplify the complex – the ability to generate insight from data and engage and communicate those insights effectively with non-technical business customers. Strong documentation and governance skills, ensuring clear audit trails of decisions and data processes. Preferred Qualifications / Skills: Financial services experience, especially within banking or wealth management. Experience in financial crime risk management, with emphasis on AML/CTF and Sanctions. Experience in the AWS tool stack for analytics (EMR, S3, etc.). Experience in data visualisation tools such as Power BI, Qlik Sense or Tableau. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way.
Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
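The transaction-monitoring scenarios this role centers on are, at their simplest, aggregate-and-threshold rules. A toy sketch below; the threshold, account IDs, and amounts are invented and do not represent a real AML rule.

```python
from collections import defaultdict

# A toy transaction-monitoring scenario: flag accounts whose one-day total
# crosses a threshold. Values here are illustrative, not a real AML limit.
THRESHOLD = 10_000

def flag_accounts(transactions):
    """transactions: iterable of (account, amount) pairs for one day.
    Returns the sorted list of accounts whose total exceeds THRESHOLD."""
    totals = defaultdict(float)
    for account, amount in transactions:
        totals[account] += amount
    return sorted(acct for acct, total in totals.items() if total > THRESHOLD)

txns = [("A-1", 6_000), ("A-1", 5_500), ("B-2", 900), ("C-3", 12_000)]
flagged = flag_accounts(txns)  # A-1 (11,500) and C-3 (12,000) exceed the limit
```

In production such scenarios usually live in SQL (GROUP BY plus HAVING over a time window) with Python orchestrating, tuning, and back-testing the thresholds.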

Posted 2 months ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Noida, Pune, Gurugram

Hybrid

Role: Lead Data Engineer. Experience: 7-12 years. Must-Have: 7+ years of relevant experience in Data Engineering and delivery. 7+ years of relevant work experience in Big Data concepts. Worked on cloud implementations. Experience in Snowflake, SQL, AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture). Good experience with AWS Cloud and microservices: AWS Glue, S3, Python, and PySpark. Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate. Should be able to do coding, debugging, performance tuning, and deploying apps to the production environment. Experience working in Agile methodology. Ability to learn, and help the team learn, new technologies quickly. Excellent communication and coordination skills. Good to have: Experience in DevOps tools (Jenkins, Git, etc.) and practices, continuous integration, and delivery (CI/CD) pipelines. Spark, Python, SQL (exposure to Snowflake), Big Data concepts, AWS Glue. Worked on cloud implementations (migration, development, etc.). Role & Responsibilities: Be accountable for the delivery of the project within the defined timelines with good quality. Work with the clients and offshore leads to understand requirements, come up with high-level designs, and complete development and unit testing activities. Keep all the stakeholders updated about the task status/risks/issues if there are any. Keep all the stakeholders updated about the project status/risks/issues if there are any. Work closely with the management wherever and whenever required, to ensure smooth execution and delivery of the project. Guide the team technically and give the team directions on how to plan, design, implement, and deliver the projects. Education: BE/B.Tech from a reputed institute.

Posted 2 months ago

Apply

1.0 - 3.0 years

15 - 20 Lacs

Pune

Work from Office

Company Overview: With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose — people — then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you. Job Summary: The Analytics Consultant I is a business intelligence focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions and UKG Datahub. The candidate is also responsible for interacting with other business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant I will also be responsible for developing custom analytics solutions and reports to the specifications provided, and for supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources.
The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations. Key Responsibilities: Interact with other business and technical project stakeholders to gather business requirements. Deploy and configure the UKG Analytics and Data Hub products based on the design documents. Develop and deliver best-practice visualizations and dashboards using BI tools such as Cognos, BIRT or Power BI. Put together a test plan, validate the solution deployed and document the results. Provide support during production cutover, and after go-live act as the first level of support for any requests that come through from the customer or other Consultants. Analyze the customer’s data to spot trends and issues and present the results back to the customer. Required Qualifications: 1-3 years’ experience designing and delivering Analytical/Business Intelligence solutions required. Cognos, BIRT, Power BI or other business intelligence toolset experience required. ETL experience using Talend or other industry-standard ETL tools strongly preferred. Advanced SQL proficiency is a plus. Knowledge of Google Cloud Platform or Azure or something similar is desired, but not required. Knowledge of Python is desired, but not required. Willingness to learn new technologies and adapt quickly is required. Strong interpersonal and problem-solving skills. Flexibility to support customers in different time zones is required. Where we’re going: UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today.
Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! UKGCareers@ukg.com

Posted 2 months ago

Apply

3.0 - 8.0 years

15 - 20 Lacs

Noida

Work from Office

Company Overview: With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose — people — then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you. The Analytics Consultant II (Level 2) is a business intelligence focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions and UKG Datahub. The candidate is also responsible for interacting with other business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant II will also be responsible for developing custom analytics solutions and reports to the specifications provided, and for supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources.
The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations. Responsibilities include: Interact with other business and technical project stakeholders to gather business requirements. Deploy and configure the UKG Analytics and Data Hub products based on the design documents. Develop and deliver best-practice visualizations and dashboards using BI tools such as Cognos, BIRT or Power BI. Put together a test plan, validate the solution deployed and document the results. Provide support during production cutover, and after go-live act as the first level of support for any requests that come through from the customer or other Consultants. Analyse the customer’s data to spot trends and issues and present the results back to the customer. Qualifications: 3+ years’ experience designing and delivering Analytical/Business Intelligence solutions required. Cognos, BIRT, Power BI or other business intelligence toolset experience required. ETL experience using Talend or other industry-standard ETL tools strongly preferred. Advanced SQL proficiency is a plus. Knowledge of Google Cloud Platform or Azure or something similar is desired, but not required. Knowledge of Python is desired, but not required. Willingness to learn new technologies and adapt quickly is required. Strong interpersonal and problem-solving skills. Flexibility to support customers in different time zones is required. Where we’re going: UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today.
Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! UKG is proud to be an equal-opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process. Disability Accommodation UKGCareers@ukg.com

Posted 2 months ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Jaipur

Work from Office

Job Title: Associate, Regulatory Reporting Team. Location: Jaipur, India. Role Description: The role is to perform a number of key functions that support and control the business in complying with a number of regulatory requirements such as MiFID II, EMIR, CFTC and SFTR. This role forms part of a team in Bangalore that supports regulatory reporting across all asset classes: Rates, Credit, Commodities, Equities, Loans and Foreign Exchange. Key responsibilities include day-to-day exception management, MIS compilation and User Acceptance Testing (UAT). This role also involves supporting in-house tech requirements such as building out reports, macros etc. What we'll offer you: 100% reimbursement under the child care assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance. Your key responsibilities: Performing and/or managing various exception management functions across reporting for all asset classes, across multiple jurisdictions. Ensure accurate, timely and complete reporting. Working closely with our technology development teams to design system solutions, with the aim to automate as much of the exceptions process as possible. Liaising with internal and external teams to propose developments to the current architecture in order to ensure greater compliance with regulatory requirements and drive improved STP processing of our reporting across all asset classes. Perform root cause analysis of exceptions, with investigation and appropriate escalation of any significant issues found through testing, rejection remediation or any other stream to senior management, to ensure transparency exists in our controls. Ability to build and maintain effective operational processes and prioritise activities based on risk. Clear communication and escalation. Ability to recognize high-risk situations and deal with them in a prompt manner. Documentation of BI deliverables.
Support the design of data models, reports and visualizations to meet business needs. Develop end-user reports and visualizations. Your skills and experience: 5-8 years' work experience within an Ops role within financial services. Graduate in Science/Technology/Engineering/Mathematics. Regulatory experience (MiFIR, EMIR, Dodd-Frank, Bank of England etc.) is preferred. Preferably experience in Middle Office/Back Office, Reference Data, and excellent knowledge of the trade life cycle (at least 2 asset classes: Equities, Credit, Rates, Foreign Exchange, Commodities). Ability to work independently, as well as in a team environment. Clear and concise communication and escalation. Ability to recognise high-risk situations and deal with them in a prompt manner. Ability to identify and prioritize multiple tasks that have potential operational risk and P/L impact in an often high-pressure environment. Experience in data analysis with intermediate/advanced Microsoft Office Suite skills, including VBA. Experience in building reports and BI analysis with tools such as SAP Business Objects, Tableau, QlikView etc. Advanced SQL experience is preferred. How we'll support you: About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Experience: 8+ years of Data Engineering experience. 3+ years of experience with cloud platform services (preferably GCP). 2+ years of hands-on experience with Pentaho. Hands-on experience in building and optimizing data pipelines and data sets. Hands-on experience with data extraction and transformation tasks while taking care of data security, error handling and pipeline performance. Hands-on experience with relational SQL (Oracle, SQL Server or MySQL) and NoSQL databases. Advanced SQL experience - creating and debugging stored procedures, functions, triggers and object types in PL/SQL statements. Hands-on experience with programming languages - Java (mandatory), Go, Python. Hands-on experience in unit testing data pipelines. Experience in using Pentaho Data Integration (Kettle/Spoon) and debugging issues. Experience supporting and working with cross-functional teams in a dynamic environment. Technical Skills: Programming & Languages: Java. Database Tech: Oracle, Spanner, BigQuery, Cloud Storage. Operating Systems: Linux. Good knowledge and understanding of cloud-based ETL frameworks and tools. Good understanding and working knowledge of batch and streaming data processing. Good understanding of Data Warehousing architecture. Knowledge of open table and file formats (e.g. Delta, Hudi, Iceberg, Avro, Parquet, JSON, CSV). Strong analytic skills related to working with unstructured datasets. Excellent numerical and analytical skills. Responsibilities: Design and develop various standard/reusable ETL jobs and pipelines. Work with the team in extracting the data from different data sources like Oracle, cloud storage and flat files. Work with database objects including tables, views, indexes, schemas, stored procedures, functions, and triggers. Work with the team to troubleshoot and resolve issues in job logic as well as performance.
Write ETL validations based on design specifications for unit testing Work with the BAs and the DBAs for requirements gathering, analysis, testing, metrics and project coordination.
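The trigger work this role describes is done in PL/SQL on Oracle; SQLite has no stored procedures, but its triggers are close enough for a small sketch via Python's sqlite3. The orders/order_audit schema is invented for illustration:

```python
import sqlite3

# Illustrative audit trigger, analogous to a PL/SQL AFTER UPDATE trigger.
# Table names are hypothetical, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT);
CREATE TABLE order_audit (order_id INTEGER, old_status TEXT, new_status TEXT);

-- Record every status change in the audit table.
CREATE TRIGGER trg_order_status AFTER UPDATE OF status ON orders
BEGIN
    INSERT INTO order_audit VALUES (OLD.id, OLD.status, NEW.status);
END;
""")
conn.execute("INSERT INTO orders (status) VALUES ('NEW')")
conn.execute("UPDATE orders SET status = 'SHIPPED' WHERE id = 1")
audit = conn.execute("SELECT * FROM order_audit").fetchall()
print(audit)  # → [(1, 'NEW', 'SHIPPED')]
```

Debugging such triggers usually means querying the audit table after a controlled update, exactly as done above.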

Posted 2 months ago

Apply

8.0 - 13.0 years

4 - 8 Lacs

Mumbai

Work from Office

4+ years of experience as a Data Engineer or in a similar role.
Proficiency in Python, PySpark, and advanced SQL.
Hands-on experience with big data tools and frameworks (e.g., Spark, Hive).
Experience with cloud data platforms such as AWS, Azure, or GCP is a plus.
Solid understanding of data modeling, warehousing, and ETL processes.
Strong problem-solving and analytical skills.
Good communication and teamwork abilities.

Design, build, and maintain data pipelines that collect, process, and store data from various sources.
Integrate data from multiple heterogeneous sources such as databases (SQL/NoSQL), APIs, cloud storage, and flat files.
Optimize data processing tasks to improve execution efficiency, reduce costs, and minimize processing times, especially when working with large-scale datasets in Spark.
Design and implement data warehousing solutions that centralize data from multiple sources for analysis.
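The pipeline responsibilities above follow the classic extract-transform-load shape. A minimal plain-Python sketch (in the role itself this would be a PySpark job; the CSV source and fact_orders table are invented for illustration):

```python
import csv
import io
import sqlite3

# Extract: a CSV source with one record missing its amount.
raw = "order_id,amount\n1,10.5\n2,\n3,4.0\n"

# Transform: validate, drop bad records, cast types.
rows = []
for rec in csv.DictReader(io.StringIO(raw)):
    if rec["amount"]:                # skip records with a missing amount
        rows.append((int(rec["order_id"]), float(rec["amount"])))

# Load: into a hypothetical warehouse fact table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?)", rows)
total = conn.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
print(total)  # → 14.5
```

At Spark scale the same validate/cast step would run as DataFrame filters and casts distributed across executors.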

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Pune

Work from Office

Provide expertise in analysis, requirements gathering, design, coordination, customization, testing and support of reports in the client's environment.
Develop and maintain a strong working relationship with business and technical members of the team.
Maintain a relentless focus on quality and continuous improvement.
Perform root cause analysis of report issues.
Handle development and evolutionary maintenance of the environment, its performance, capability and availability.
Assist in defining technical requirements and developing solutions.
Ensure effective content and source-code management, troubleshooting and debugging.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise

5+ years of experience with BI tools, with expertise and/or certification in at least one major BI platform (Tableau preferred).
Advanced knowledge of SQL, including the ability to write complex stored procedures, views, and functions.
Proven capability in data storytelling and visualization, delivering actionable insights through compelling presentations.
Excellent communication skills, with the ability to convey complex analytical findings to non-technical stakeholders in a clear, concise, and meaningful way.
Identifying and analyzing industry trends, geographic variations, competitor strategies, and emerging customer behavior.

Preferred technical and professional experience

Troubleshooting capabilities to debug data controls.
Capable of converting business requirements into workable models.
Good communication skills, willingness to learn new technologies; a team player, self-motivated, with a positive attitude.
Must have a thorough understanding of SQL and advanced SQL (joins and relationships).
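The "complex views" and "joins & relationships" skills this posting names boil down to joining fact and dimension tables behind a view a BI tool can consume. A small sketch using Python's sqlite3 (the sales schema is invented for illustration):

```python
import sqlite3

# Hypothetical star-schema fragment: a sales fact joined to a region dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE regions (region_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE sales (sale_id INTEGER PRIMARY KEY, region_id INTEGER, amount REAL);
INSERT INTO regions VALUES (1, 'North'), (2, 'South');
INSERT INTO sales VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);

-- View joining fact to dimension; a BI tool such as Tableau would
-- connect to this view rather than the raw tables.
CREATE VIEW v_region_sales AS
SELECT r.name AS region, SUM(s.amount) AS total
FROM sales s JOIN regions r ON r.region_id = s.region_id
GROUP BY r.name;
""")
rows = conn.execute("SELECT * FROM v_region_sales ORDER BY region").fetchall()
print(rows)  # → [('North', 150.0), ('South', 75.0)]
```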

Posted 2 months ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

Telangana

Work from Office

Education: Bachelor's degree in Computer Science, Engineering, or a related field; a Master's degree is preferred.
Experience: minimum of 4+ years of experience in data engineering or a similar role.
Strong programming skills in Python and advanced SQL.
Strong experience with NumPy, Pandas, and DataFrames.
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.
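The Pandas/DataFrame experience asked for here centres on split-apply-combine operations. A plain-Python sketch of the same groupby-sum (the records are invented; the pandas equivalent is noted in a comment):

```python
from collections import defaultdict

# Hypothetical records; in pandas this would be a DataFrame.
records = [
    {"dept": "sales", "value": 10},
    {"dept": "ops",   "value": 5},
    {"dept": "sales", "value": 7},
]

# Equivalent pandas call: df.groupby("dept")["value"].sum()
totals = defaultdict(int)
for rec in records:
    totals[rec["dept"]] += rec["value"]

print(dict(totals))  # → {'sales': 17, 'ops': 5}
```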

Posted 2 months ago

Apply

7.0 - 12.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Job type: Contract to Hire.

Minimum 5+ years of professional experience with Manhattan Distributed Order Management.
Strong background in direct-to-consumer processes, distributed order management, supply chain, and fulfillment operations using Manhattan OMS.
Expertise in various Manhattan modules such as Enterprise Order Management (EOM), Distributed Order Management (DOM), Call Center, Order Orchestration, Enterprise Inventory, Available to Commerce (ATC), and Supply Chain Intelligence (SCI).
Experience with SCI Reporting (IBM Cognos) for building, scheduling, and modifying reports and dashboards.
REST APIs, Java, JSON, data mapping & SQL; advanced SQL knowledge and experience is preferred.
Proven grasp of architecture and integration design.
Experience supporting full Agile and Waterfall software development lifecycles, including understanding business processes, assembling user requirements, design, testing, deployment, and training.
Experience with payment processing, fraud, tax, warehouse management, fulfillment solutions, and ERP integrations.
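The "REST APIs, JSON, data mapping" skill here means translating one system's JSON payload into another's record shape. A hedged sketch: the field names on both sides are invented, and Manhattan's actual order schemas differ.

```python
import json

# Hypothetical inbound e-commerce order payload.
inbound = json.loads("""{
  "orderId": "WEB-1001",
  "lines": [
    {"sku": "ABC", "qty": 2, "price": 9.99},
    {"sku": "XYZ", "qty": 1, "price": 5.00}
  ]
}""")

# Map to a hypothetical OMS-style record: rename, count, aggregate.
oms_order = {
    "external_ref": inbound["orderId"],
    "line_count": len(inbound["lines"]),
    "order_total": round(sum(l["qty"] * l["price"] for l in inbound["lines"]), 2),
}
print(oms_order)  # → {'external_ref': 'WEB-1001', 'line_count': 2, 'order_total': 24.98}
```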

Posted 2 months ago

Apply