3.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Requisition Id: 1603797

As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom. At EY, we don't just focus on who you are now, but who you can become. We believe that it's your career and "It's yours to build", which means potential here is limitless, and we'll provide you with motivating and fulfilling experiences throughout your career to help you on the path to becoming your best professional self.

The opportunity: Associate Consultant-TMT-Assurance-ASU - TR - Technology Risk - Hyderabad

TMT: Industry convergence offers TMT (Technology, Media & Entertainment, and Telecommunications) organizations the chance to evolve and transform, but it also presents challenges around competitiveness and delivering agile corporate strategies for growth. We help TMT companies create compelling employee and customer experiences, retaining skills and talent while achieving enterprise-wide operational excellence. We help them guard their data, brand and reputation. We also enable the pursuit of M&A strategies that methodically create value, reduce risk and transform TMT companies into powerhouses that will lead the technology revolution of the future, building a better working world for all.

ASU - TR - Technology Risk: Assurance's purpose is to inspire confidence and trust to enable a complex world to work. We do so by protecting and serving the public interest, promoting transparency, supporting investor confidence and economic growth, and fostering talent to provide future business leaders. We help clients by:
- Ensuring their accounts comply with the requisite audit standards
- Providing a robust and clear perspective to audit committees
- Providing critical information for stakeholders

Our Service Offerings include External Audit, Financial Accounting Advisory Services (FAAS), IFRS & US GAAP conversion, IPO and other public offerings, Corporate Treasury - IFRS 9 accounting & implementation support, etc.

Your key responsibilities

Technical Excellence
To carry out operational, financial, process and systems audits designed to review and appraise the client's activities, systems, and controls, which includes:
- Identifying accounting & auditing issues, and discussing them with the audit in-charge to resolve issues that arise
- Carrying out substantive testing in accordance with the audit plan formulated, and appropriately documenting all work performed
- Identifying improvements to control systems and procedures
- Presenting audit reports that clearly highlight key audit recommendations to management
- Preparing & maintaining statutory books of accounts, audit, reconciliation of accounts receivable & payable, and finalization of accounts
- Assisting seniors in reviewing & checking the financial statements, and preparing the audit report, management comment letter & management representation letter

Skills and attributes for success
- Strong knowledge of auditing & accounting standards
- Prior experience handling statutory audit assignments
- Ability to prioritize work on multiple assignments & manage ambiguity
- Strong verbal and written communication skills
- Clarity of thought and assertiveness
- Effective and creative written expression: logical, readable and concise
- Ability to meet deadlines
Audit Analytics
- Foundational analytics in areas such as journal entry testing (see the sketch after this listing), Hire to Retire, Procure to Pay, Acquire to Retire, Order to Cash, Record to Report, Production and Inventory, and re-performance testing
- Sector-specific analytics (advanced/custom analytics)
- Visualization
- Automated analytics model development for statutory audit enablements
- Data extraction from client ERPs
- Design, review and rectify algorithms, systems and processes to analyze client data, and implement them in the appropriate tools and technologies

Skills and attributes
- Hands-on experience in analytics for financial reporting
- Hands-on experience in machine learning using R or Python with a strong statistical background is a must
- Knowledge of databases and ETL, and hands-on experience with or theoretical knowledge of SQL
- Experience in any of the visualization tools such as Tableau, Spotfire, Qlikview, etc.
- Knowledge of and experience in ERPs such as SAP and/or Oracle, including data extraction
- Proficiency in the MS Office Suite (advanced Excel skills & macros)
- Experience in NLP, web scraping, log analytics, TensorFlow, AI, or Beautiful Soup
- Experience in IronPython/JavaScript

To qualify for the role you must have
Qualification: BE/B.Tech, or MSc in Computer Science/Statistics, or MCA
Experience: 3 - 6 years of relevant experience

What we look for
People with the ability to work in a collaborative manner to provide services across multiple client departments while following the commercial and legal requirements. You will need a practical approach to solving issues and complex problems with the ability to deliver insightful and practical solutions. We look for people who are agile, curious, mindful and able to sustain positive energy, while being adaptable and creative in their approach.

What we offer
With more than 200,000 clients, 300,000 people globally and 33,000 people in India, EY has become the strongest brand and the most attractive employer in our field, with market-leading growth over competitors. Our people work side-by-side with market-leading entrepreneurs, game-changers, disruptors and visionaries. As an organisation, we are investing more time, technology and money than ever before in skills and learning for our people. At EY, you will have a personalized Career Journey and also the chance to tap into the resources of our career frameworks to better understand your roles, skills and opportunities. EY is equally committed to being an inclusive employer, and we strive to achieve the right balance for our people, enabling us to deliver excellent client service whilst allowing our people to build their careers as well as focus on their wellbeing. If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible. Join us in building a better working world. Apply now.
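As an illustration of the journal-entry testing named above, here is a minimal sketch in Python/pandas; the column names, risk rules, and thresholds are hypothetical, and a real engagement would pull entries from the client ERP rather than build them inline:

```python
import pandas as pd

# Hypothetical journal-entry extract; real engagements pull this from the client ERP.
entries = pd.DataFrame({
    "je_id":     [1001, 1002, 1003, 1004],
    "posted_by": ["akumar", "sysadmin", "akumar", "rrao"],
    "posted_at": pd.to_datetime(["2024-03-29 11:02", "2024-03-31 23:55",
                                 "2024-04-06 10:15", "2024-03-30 09:40"]),
    "amount":    [12500.00, 1000000.00, 987.65, 50000.00],
})

# Flag common risk indicators used in journal-entry testing:
# weekend postings, round amounts, and entries near period end.
entries["weekend"] = entries["posted_at"].dt.dayofweek >= 5
entries["round_amount"] = entries["amount"] % 10000 == 0
entries["near_period_end"] = entries["posted_at"].dt.is_month_end | \
    (entries["posted_at"].dt.day >= 29)

# Any entry tripping at least one rule goes to the auditor for follow-up.
flagged = entries[entries[["weekend", "round_amount", "near_period_end"]].any(axis=1)]
print(flagged[["je_id", "posted_by", "amount"]])
```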
Posted 4 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Welcome to Warner Bros. Discovery... the stuff dreams are made of.

Who We Are...
When we say, "the stuff dreams are made of," we're not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what's next... From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

Analytics Engineer II - Hyderabad, India

About Warner Bros. Discovery
Warner Bros. Discovery, a premier global media and entertainment company, offers audiences the world's most differentiated and complete portfolio of content, brands and franchises across television, film, streaming and gaming. The new company combines Warner Media's premium entertainment, sports and news assets with Discovery's leading non-fiction and international entertainment and sports businesses. For more information, please visit www.wbd.com.

Roles & Responsibilities
As an Analytics Engineer II, you will perform data analytics and data visualization-related efforts for the Data & Analytics organization at WBD. You're an engineer who not only understands how to use big data to answer complex business questions but also how to design semantic layers to best support self-service vehicles. You will manage projects from requirements gathering to planning to implementation of full-stack data solutions (pipelines to data tables to visualizations) with the support of the larger team. You will work closely with cross-functional partners to ensure that business logic is properly represented in the semantic layer and production environments, where it can be used by the wider Data & Analytics team to drive business insights and strategy.
- Design and implement data models that support flexible querying and data visualization
- Partner with stakeholders to understand business questions and build out advanced analytical solutions
- Advance automation efforts that help the team spend less time manipulating & validating data and more time analyzing
- Build frameworks that multiply the productivity of the team and are intuitive for other data teams to leverage
- Participate in the creation and support of analytics development standards and best practices
- Create systematic solutions for solving data anomalies: identifying, alerting, and root cause analysis
- Work proactively with stakeholders to understand business needs and build data analytics capabilities, especially in large enterprise use cases
- Identify and explore new opportunities through creative analytical and engineering methods

What To Bring
- Bachelor's degree, MS or greater in a quantitative field of study (Computer/Data Science, Engineering, Mathematics, Statistics, etc.)
- 3+ years of relevant experience in business intelligence/data engineering
- Expertise in writing SQL (clean, fast code is a must) and in data-warehousing concepts such as star schemas, slowly changing dimensions (illustrated in the sketch after this listing), ELT/ETL, and MPP databases
- Experience in transforming flawed/changing data into consistent, trustworthy datasets
- Experience with general-purpose programming (e.g.
Python, Scala, or other), dealing with a variety of data structures, algorithms, and serialization formats, will be a plus
- Experience with big-data technologies (e.g. Spark, Hadoop, Snowflake, etc.)
- Advanced ability to build reports and dashboards with BI tools (such as Looker, Tableau or Power BI)
- Experience with analytics tools such as Athena, Redshift, BigQuery or Snowflake
- Proficiency with Git (or similar version control) and CI/CD best practices will be a plus
- Ability to write clear, concise documentation and to communicate generally with a high degree of precision
- Ability to solve ambiguous problems independently
- Ability to manage multiple projects and time constraints simultaneously
- Care for the quality of the input data and how the processed data is ultimately interpreted and used
- Prior experience in large enterprise use cases such as Sales Analytics, Financial Analytics, or Marketing Analytics
- Strong written and verbal communication skills

Characteristics & Traits
- Naturally inquisitive, critical thinker, proactive problem-solver, and detail-oriented
- Positive attitude and an open mind
- Strong organizational skills with the ability to act independently and responsibly
- Self-starter, comfortable taking projects from design to execution with minimal supervision
- Ability to manage and balance multiple (and sometimes competing) priorities in a fast-paced, complex business environment, and to manage time effectively to consistently meet deadlines
- Team player and relationship builder

What We Offer
- A great place to work
- Equal opportunity employer
- Fast-track growth opportunities

How We Get Things Done...
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day-to-day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you're a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
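The slowly-changing-dimension concept named in the requirements can be illustrated with a minimal Type 2 sketch in Python/pandas; the table, columns, and incoming change are hypothetical, and a warehouse implementation would express the same logic in SQL MERGE statements:

```python
import pandas as pd

# Current dimension table (SCD Type 2): one row per customer per validity window.
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Pune", "Delhi"],
    "valid_from": pd.to_datetime(["2023-01-01", "2023-01-01"]),
    "valid_to": pd.to_datetime(["9999-12-31", "9999-12-31"]),
    "is_current": [True, True],
})

# Incoming change: customer 2 moved to Hyderabad.
change = {"customer_id": 2, "city": "Hyderabad",
          "effective": pd.Timestamp("2024-06-01")}

# Close out the old row...
mask = (dim["customer_id"] == change["customer_id"]) & dim["is_current"]
dim.loc[mask, ["valid_to", "is_current"]] = [change["effective"], False]

# ...and append the new version, preserving full history for point-in-time queries.
dim = pd.concat([dim, pd.DataFrame([{
    "customer_id": change["customer_id"], "city": change["city"],
    "valid_from": change["effective"],
    "valid_to": pd.Timestamp("9999-12-31"), "is_current": True,
}])], ignore_index=True)

print(dim)
```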
Posted 4 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com.

Job Description
We are looking for a skilled Test Engineer with experience in automated testing, rollback testing, and continuous integration environments. You will be responsible for ensuring the quality and reliability of our software products through automated testing strategies and robust test frameworks.
- Design and execute end-to-end test strategies for data pipelines, ETL/ELT jobs, and database systems
- Validate data quality, completeness, transformation logic, and integrity across distributed data systems (e.g., Hadoop, Spark, Hive)
- Develop Python-based automated test scripts to validate data flows, schema validations, and business rules (a minimal sketch follows this listing)
- Write complex SQL queries to verify large datasets across staging and production environments
- Identify data issues and work closely with data engineers to resolve discrepancies
- Contribute to test data management, environment setup, and regression testing processes
- Work collaboratively with data engineers, business analysts, and QA leads to ensure accurate and timely data delivery
- Participate in sprint planning, reviews, and defect triaging as part of the Agile process

Qualifications
- 4+ years of experience in Data Testing, Big Data Testing, and/or Database Testing
- Strong programming skills in Python for automation and scripting
- Expertise in SQL for writing complex queries and validating large datasets
- Experience with Big Data technologies such as Hadoop, Hive, Spark, HDFS, Kafka (any combination is acceptable)
- Hands-on experience with ETL/ELT testing and validating data transformations and pipelines
- Exposure to cloud data platforms like AWS (Glue, S3, Redshift), Azure (Data Lake, Synapse), or GCP is a plus
- Familiarity with test management and defect tracking tools like JIRA, TestRail, or Zephyr
- Experience with CI/CD pipelines and version control (e.g., Git) is an advantage
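A minimal sketch of the kind of Python-based automated data test this posting describes, written in pytest style; the tables and columns are hypothetical, and a real suite would query the warehouse rather than construct DataFrames inline:

```python
import pandas as pd

def test_load_preserves_rows_and_keys():
    """Reconciliation-style check between a staging extract and the
    production target (hypothetical tables built inline for the sketch)."""
    staging = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
    target = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})

    # Row counts must match after the load.
    assert len(staging) == len(target)
    # Primary keys must be unique and identical on both sides.
    assert target["order_id"].is_unique
    assert set(staging["order_id"]) == set(target["order_id"])
    # No nulls allowed in mandatory columns.
    assert target["amount"].notna().all()
```

Run with `pytest` in CI so every pipeline change re-validates the load before promotion.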
Posted 4 days ago
5.0 years
0 Lacs
India
Remote
About Frazier & Deeter
Frazier & Deeter (FD) is an award-winning accounting & advisory firm. FD and our family of brands serve clients worldwide, from Fortune Global 500 companies to growing small businesses. Frazier & Deeter cultivates a growth mindset and instils in our people the belief that we must be adaptive and entrepreneurial, and that we make a difference for our clients and each other. We focus on our brand promise of Investing in Relationships to Make a Difference. FD offers a full range of tax, audit, accounting, and advisory services through our offices in Atlanta, Charlotte, Las Vegas, Nashville, Alpharetta, and Tampa, in addition to London and India. We have been recognized repeatedly as a Top 50 firm, a Best of the Best Accounting firm, a Best Firm to Work For® and a Best Firm for Women in Leadership. FD recently opened its own entity in India to support our expanding international and cross-border client base. We are not your typical firm of accountants, and we like doing things differently. Our team members have the opportunity to be involved in global consulting and advisory client-facing roles in a fast-growing and exciting new business in India, which is part of our established and highly regarded accountancy firms in the US and UK.

About the Role
We are seeking a versatile and curious Software Engineer to join our internal technology team. This is a hands-on, full-stack development and automation role suited for a generalist who thrives on solving business problems through data, scripting, and integration. You'll collaborate with our Atlanta-based team and business users to build new tools, streamline workflows, and automate processes that drive efficiency and insight across the organization. We build custom applications (primarily in Python/Django), leverage tools like Microsoft Power Automate for workflow automation, and work extensively in Salesforce for business operations. This role demands flexibility, eagerness to learn, and a strong desire to understand how data flows across systems to unlock business value.

Key Responsibilities
- Build and enhance internal Python/Django applications, hosted on AWS and managed via Terraform
- Create and support automations using Power Automate, Salesforce Flows, and custom scripts (Python, Java, JavaScript, etc.)
- Integrate with vendor platforms via APIs to ensure seamless cross-system functionality (a minimal sketch follows this listing)
- Perform data analysis, build reports or dashboards (e.g., Power BI), and help business users gain actionable insights from data
- Contribute to Salesforce administration: managing objects, fields, permissions, validation rules, and flows
- Collaborate across technical and non-technical teams to gather requirements and deliver user-focused solutions
- Participate in agile ceremonies, code reviews, and documentation
- Support change management and source control practices using GitHub

Required Skills and Experience
- 5+ years in a software engineering or automation-focused role
- Proficient in Python; experience with other scripting languages (Java, JavaScript) is a plus
- Familiar with data analysis concepts; experience with Power BI or equivalent is preferred
- Comfortable with APIs, REST/SOAP integrations, and/or ETL pipelines
- Experience with version control (GitHub) and agile practices
- Able to collaborate across time zones with partial overlap with the US Eastern workday
- Passion for automating repetitive tasks and improving operational efficiency
- Strong communicator, with the ability to explain technical concepts to non-technical users

Preferred Qualifications
- Experience with Microsoft Power Automate or similar workflow tools
- Experience in Salesforce administration or building declarative Salesforce flows
- Exposure to AWS, Terraform, or infrastructure-as-code principles
- Previous experience in a professional services, finance, or accounting environment

What FD offers
- Competitive salary
- Flexible working hours; hybrid (home/office) and remote working arrangements possible
- Clear career growth path within the firm
- Personal and professional skills development and training support (remote)
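As a flavor of the vendor-API integration work listed above, here is a minimal hedged sketch in Python using the requests library; the endpoint, fields, and token are entirely hypothetical:

```python
import requests

API_BASE = "https://vendor.example.com/api/v1"  # hypothetical vendor endpoint

def fetch_open_invoices(token: str) -> list[dict]:
    """Pull open invoices from a vendor platform and normalize the
    fields we care about before loading them into another system."""
    resp = requests.get(
        f"{API_BASE}/invoices",
        headers={"Authorization": f"Bearer {token}"},
        params={"status": "open"},
        timeout=30,
    )
    # Fail loudly on HTTP errors instead of silently processing bad data.
    resp.raise_for_status()
    return [
        {"id": inv["id"], "amount": inv["amount"], "due": inv["due_date"]}
        for inv in resp.json()["invoices"]
    ]
```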
Posted 4 days ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
We are looking for a DevOps Senior Engineer in the Data Engineering team who can help us support next-generation analytics applications on Oracle Cloud. This posting is for a DevOps Senior Engineer in the Oracle Analytics Warehouse product development organization. Oracle Analytics Warehouse is a fully managed cloud service that provides customers a turn-key enterprise warehouse on the cloud for Fusion Applications. The service is built on a sophisticated technology stack featuring a brand-new data integration platform and the industry's most sophisticated business analytics platform. https://www.oracle.com/solutions/business-analytics/analytics-for-applications.html

We are looking for a senior engineer with experience in supporting data warehousing products. As a member of the product development organization, the focus will be on working with development teams, providing timely support to customers, and identifying and implementing process automation for the cloud BI product.

Requirements:
- BS or higher degree in Computer Science/Engineering from a top university, or equivalent experience
- Validated experience supporting business customers on any cloud/on-premise BI application
- Experience in SQL/PL-SQL and excellent debugging skills
- Experience diagnosing network latency and intermittent issues, and reading and analyzing log files (a minimal triage sketch follows this listing)
- Good functional knowledge in the ERP, Finance, HCM or EBS domain
- Working experience with any in-demand ERP application such as Oracle EBS or Fusion is helpful
- Good programming skills in Python/Java
- Exposure to cloud infrastructure, such as Oracle Cloud Infrastructure (OCI), is helpful
- Experience in performance tuning SQL and understanding ETL pipelines
- Ability to build, configure, manage and coordinate all build and release engineering activities
- Strong logical/critical thinking and problem resolution skills
- Excellent interpersonal skills

Career Level - IC2

Responsibilities
As a member of Pipeline Production Operations, you will:
- Address customer issues and tickets within defined SLAs
- Proactively identify and resolve potential problems to prevent them from occurring and improve the overall customer experience
- Approach each case with the goal of ensuring Oracle Analytics products are performing at an efficient level, by addressing any underlying or additional problems uncovered during each customer engagement
- Coordinate and connect with different team members to formulate solutions to customer issues
- Ensure full understanding of the issue, including its impact on the customer
- Recommend solutions to customers and follow through to resolution, or escalate the case in a timely manner if no resolution can be found
- Gather logs and configuration details, and attempt to reproduce the reported issues
- Develop and improve the knowledge base for issues and their solutions
- Participate in knowledge sharing via involvement in technical discussions and knowledge base documentation
- Prioritize workload based on severity and demonstrate a sense of urgency when handling cases
- Find opportunities for process improvement and automation by building the right utilities/tools
- Be willing to work shifts and weekends based on the support rota

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
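A minimal sketch of the log-file triage skill this posting asks for, using only the Python standard library; the log format and messages are hypothetical, and real runs would read from the service's log files:

```python
import re
from collections import Counter

# Hypothetical log lines; real ones would be read from the service's log files.
LOG = """\
2024-06-01 10:02:11 INFO  pipeline.extract  completed in 412 ms
2024-06-01 10:02:13 ERROR pipeline.load     ORA-01555: snapshot too old
2024-06-01 10:05:47 WARN  pipeline.network  retry 3/5, latency 2900 ms
2024-06-01 10:06:02 ERROR pipeline.load     ORA-01555: snapshot too old
"""

pattern = re.compile(r"^\S+ \S+ (?P<level>\w+)\s+(?P<component>\S+)\s+(?P<msg>.*)$")

errors = Counter()
for line in LOG.splitlines():
    m = pattern.match(line)
    if m and m.group("level") == "ERROR":
        errors[m.group("msg")] += 1

# Most frequent error messages first: a quick signal of where to start digging.
for msg, count in errors.most_common():
    print(f"{count}x {msg}")
```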
Posted 4 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
TriNet is a leading provider of comprehensive human resources solutions for small to midsize businesses (SMBs). We enhance business productivity by enabling our clients to outsource their HR function to one strategic partner and allowing them to focus on operating and growing their core businesses. Our full-service HR solutions include features such as payroll processing, human capital consulting, employment law compliance and employee benefits, including health insurance, retirement plans and workers' compensation insurance. TriNet has a nationwide presence and an experienced executive team. Our stock is publicly traded on the NYSE under the ticker symbol TNET. If you're passionate about innovation and making an impact on the large SMB market, come join us as we power our clients' business success with extraordinary HR.

Don't meet every single requirement? Studies have shown that many potential applicants discourage themselves from applying to jobs unless they meet every single requirement. TriNet always strives to hire the most qualified candidate for a particular role, ensuring we deliver outstanding results for our small and medium-size customers. So if you're excited about this role but your past experience doesn't align perfectly with every single qualification in the job description, we encourage you to apply anyway; nobody's perfect. You may just be the right candidate for this or other roles.

A Brief Overview
The Senior Product Manager - Data Enablement will be responsible for the roadmap, requirements, and orchestration of the implementation of shared hyperautomation solutions (i.e., Business Process Automation, Workflow Orchestration, Intelligent Document Processing, and Data Integrations).

What You Will Do

Application Vision & Strategy
- Define and communicate the application vision and strategy
- Develop and maintain a roadmap based on this vision and applicable use cases and business needs
- Ensure the solutions align with the company's overall strategy and goals

Backlog Management
- Translate the strategy into detailed requirements
- Manage the backlog, including prioritizing tasks based on changing requirements
- Incorporate feature requests into the product roadmap
- Develop user stories and define acceptance criteria

Development Oversight
- Oversee all stages of creation, including design and development
- Monitor and evaluate progress at each stage of the process
- Participate in Scrum meetings and sprints

Stakeholder Collaboration
- Collaborate with prospective users to understand and anticipate their needs
- Work with the team and end-users to deliver updates and status reports
- Communicate goals, progress, and outcomes to senior management

Release Planning
- Plan releases and upgrades
- Follow the progress of work and address production issues during sprints
- Ensure timely delivery of features and enhancements

Continuous Improvement
- Analyze preferences and requests of end-users
- Refine the agile methodology based on results and client feedback
- Keep track of industry trends and incorporate them into the product development process

Work Hygiene
- Demonstrate exceptional organizational skills, attention to detail, and the ability to work collaboratively across the organization to drive impactful results

Risk Management
- Identify data-related risks and develop mitigation strategies
- Ensure data solutions support data privacy and security requirements

Cost Benefit Analysis
- Conduct ROI analysis and make data-driven decisions to justify automation investments
Education Qualifications
- Bachelor's Degree in information science, data management, computer science or a related field, or equivalent experience, preferred

Experience Qualifications
- Typically 5+ years of experience in product management or analysis
- Typically 2+ years of experience working with hyperautomation solutions

Skills And Abilities
- Proficiency in complex process automation and orchestration tools and technologies (e.g., Business Process Orchestration/Automation, Robotic Process Automation)
- Experience with data integration tools and technologies (e.g., ETL, data pipelines, API integrations)
- Strong understanding of data management concepts, including data quality, data governance, and master data management
- Excellent analytical and problem-solving abilities, with a keen eye for detail
- Strong communication and collaboration skills, with the ability to effectively convey complex automation-related concepts to both technical and non-technical stakeholders
- Ability to work independently and in a team environment, managing multiple priorities and delivering high-quality results within deadlines
- Knowledge of industry standards and best practices related to hyperautomation
- Continuous learning mindset, staying up-to-date with the latest hyperautomation trends and technologies

Licenses and Certifications: N/A
Travel Requirements: Minimal

Work Environment
Work is performed in a clean, pleasant, and comfortable office setting. The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodations may be made to enable persons with disabilities to perform the essential functions. This position is 100% in office.

Please Note: TriNet reserves the right to change or modify job duties and assignments at any time. The above job description is not all-encompassing. Position functions and qualifications may vary depending on business necessity.

TriNet is an Equal Opportunity Employer and does not discriminate against applicants based on race, religion, color, disability, medical condition, legally protected genetic information, national origin, gender, sexual orientation, marital status, gender identity or expression, sex (including pregnancy, childbirth or related medical conditions), age, veteran status or other legally protected characteristics. Any applicant with a mental or physical disability who requires an accommodation during the application process should contact recruiting@trinet.com to request such an accommodation.
Posted 4 days ago
6.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Description

Oracle Customer Success Services
Building on the mindset that "Who knows Oracle better than Oracle?", Oracle Customer Success Services assists customers with their requirements for some of the most cutting-edge applications and solutions. We combine the strengths of more than two decades of expertise in developing mission-critical solutions for enterprise customers with cutting-edge technology, providing our customers speed, flexibility, resiliency, and security, and enabling them to optimize their investment, minimize risk, and achieve more. The business was established with an entrepreneurial mindset and supports a vibrant, imaginative, and highly varied workplace. We are free of legacy obligations, so we'll need your help to turn this into a premier engineering hub that prioritizes quality.

Why?
Oracle Customer Success Services Engineering is responsible for designing, building, and managing cutting-edge solutions, services, and core platforms to support the managed cloud business, including but not limited to Oracle Cloud Infrastructure (OCI), Oracle Cloud Applications (SaaS) and Oracle Enterprise Applications. This position is for the CSS Architecture Team, and we are searching for the finest and brightest technologists as we embark on the road to cloud-native digital transformation. We operate under a garage culture, rely on cutting-edge technology in our daily work, and provide a highly innovative, creative, and experimental work environment. We prefer to innovate and move quickly, putting a strong emphasis on scalability and robustness. We need your assistance to build a top-tier engineering team that has a significant influence.

What?
As a Principal Data Science & AIML Engineer within the CSS CDO Architecture & Platform team, you'll lead efforts in designing and building scalable, distributed, resilient services that provide artificial intelligence and machine learning capabilities on OCI & Oracle Cloud Applications for the business. You will be responsible for the design and development of machine learning systems and applications, ensuring they meet the needs of our clients and align with the company's strategic objectives. The ideal candidate will have extensive experience in machine learning algorithms, model creation and evaluation, data engineering and data processing for large-scale distributed systems, and software development methodologies.

We strongly believe in ownership and challenging the status quo. We expect you to bring critical thinking and long-term design impact while building solutions and products, defining system integrations, and addressing cross-cutting concerns. Being part of the architecture function also provides you with the unique ability to enforce new processes and design patterns that will be future-proof while building new services or products. As a thought leader, you will own and lead the complete SDLC from Architecture Design, Development, Test, Operational Readiness, and Platform SRE.

Responsibilities
As a member of the architecture team, you will be in charge of designing software products, services, and platforms, as well as creating, testing, and managing the systems and applications we create in line with the architecture patterns and standards. As a core member of the Architecture Chapter, you will be expected to advocate for the adoption of software architecture and design patterns among cross-functional teams both within and outside of engineering roles.
You will also be expected to act as a mentor and advisor to the team(s) within the software and AIML domain. As we push for digital transformation throughout the organization, you will constantly be expected to think creatively and to optimize and harmonize business processes.

Core Responsibilities
- Lead the development of machine learning models, their integration with the full-stack software ecosystem, and data engineering, and contribute to the design strategy
- Collaborate with product managers and development teams to identify software requirements and define project scopes
- Develop and maintain technical documentation, including architecture diagrams, design specifications, and system diagrams
- Analyze and recommend new software technologies and platforms to ensure the company stays ahead of the curve
- Work with development teams to ensure software projects are delivered on time, within budget, and to the required quality standards
- Provide guidance and mentorship to junior developers
- Stay up-to-date with industry trends and developments in software architecture and development practices

Required Qualifications
- Bachelor's or Master's Degree in Computer Science, Machine Learning/AI, or a closely related field
- 6+ years of experience in software development, machine learning, data science, and data engineering design
- Proven ability to build and manage enterprise-distributed and/or cloud-native systems
- Broad knowledge of cutting-edge machine learning models and strong domain expertise in both traditional and deep learning, particularly in areas such as Recommendation Engines, NLP & Transformers, Computer Vision, and Generative AI
- Advanced proficiency in Python and frameworks such as FastAPI, Dapr & Flask or equivalent
- Deep experience with ML frameworks such as PyTorch, TensorFlow, and Scikit-learn
- Hands-on experience building ML models from scratch, transfer learning, and Retrieval Augmented Generation (RAG) using various techniques (Native, Hybrid, C-RAG, Graph RAG, Agentic RAG, and Multi-Agent RAG); a minimal retrieval sketch follows this listing
- Experience building agentic systems with SLMs and LLMs using frameworks like LangGraph + LangChain, AutoGen, LlamaIndex, Bedrock, Vertex, Agent Development Kit, Model Context Protocol (MCP) and Haystack, or equivalent
- Experience in data engineering using data lakehouse stacks, ETL/ELT, and data processing with Apache Hadoop, Spark, Flink, Beam, and dbt
- Experience with data warehouses and lakes such as Apache Iceberg, Hudi, Delta Lake, and cloud-managed solutions like OCI Data Lakehouse
- Experience in data visualization and analytics with Apache Superset, Apache Zeppelin, Oracle Analytics Cloud or similar
- Hands-on experience working with various data types and storage formats, including NoSQL, SQL, graph databases, and data serialization formats like Parquet and Arrow
- Experience with real-time distributed systems using streaming data with Kafka, NiFi, or Pulsar
- Strong expertise in software design concepts, patterns (e.g., 12-Factor Apps), and tools to create CNCF-compliant software, with hands-on knowledge of containerization technologies like Docker and Kubernetes
- Proven ability to build and deploy software applications on one or more public cloud providers (OCI, AWS, Azure, GCP, or similar)
- Demonstrated ability to write full-stack applications using polyglot programming with languages/frameworks like FastAPI, Python, and Golang
- Experience designing API-first systems with application stacks like FARM and MERN, and technologies such as gRPC and REST
- Solid understanding of Design Thinking, Test-Driven Development (TDD), BDD, and the end-to-end SDLC
- Experience in DevOps practices, including Kubernetes, CI/CD, and Blue-Green and Canary deployments
- Experience with microservice architecture patterns, including API Gateways, Event-Driven & Reactive Architecture, CQRS, and SAGA
- Familiarity with OOP design principles (SOLID, DRY, KISS, Common Closure, and Module Encapsulation)
- Proven ability to design software systems using various design patterns (Creational, Structural, and Behavioral)
- Strong interpersonal skills and the ability to effectively communicate with business stakeholders
- Demonstrated ability to drive technology adoption in AIML solutions and the CNCF software stack
- Excellent analytical, problem-solving, communication, and leadership skills

Qualifications
Career Level - IC4

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
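To make the RAG requirement above concrete, here is a minimal retrieval sketch in plain Python/NumPy. A production system would use a real embedding model and a vector store; the toy embed function, corpus, and prompt here are purely illustrative:

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy stand-in for an embedding model: hash characters into a vector.
    Real systems would call a sentence-embedding model instead."""
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 64] += 1.0
    return vec / (np.linalg.norm(vec) or 1.0)

# Indexed document chunks (the retrieval corpus).
chunks = [
    "OCI Data Lakehouse stores curated tables in Delta format.",
    "The support rota rotates weekly across regions.",
    "RAG augments an LLM prompt with retrieved context.",
]
index = np.stack([embed(c) for c in chunks])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the top-k chunks by cosine similarity to the query."""
    scores = index @ embed(query)
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

# The retrieved context is then prepended to the LLM prompt.
context = "\n".join(retrieve("How does retrieval augmented generation work?"))
prompt = f"Answer using this context:\n{context}\n\nQuestion: ..."
print(prompt)
```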
Posted 4 days ago
7.0 - 12.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Job Summary:
We are seeking a skilled Database Developer (SQL) to design, develop, and maintain efficient database solutions. The ideal candidate will have expertise in SQL development, database optimization, and data integrity management. This role involves working closely with cross-functional teams to develop and support database applications, ensuring performance, security, and scalability.

Location: Bangalore
Mode of Work: Work from office (5 days)

Key Responsibilities:
- Design, develop, and optimize SQL databases to support business applications
- Write complex SQL queries, stored procedures, functions, and triggers for efficient data retrieval and processing
- Apply database performance tuning, indexing, and optimization techniques
- Maintain database integrity, security, and compliance with industry standards
- Collaborate with software developers and analysts to understand database requirements
- Develop and maintain ETL processes for data extraction, transformation, and loading
- Troubleshoot database-related issues and provide timely resolutions
- Monitor database performance and conduct regular maintenance tasks
- Document database designs, processes, and best practices

Preferred Skills & Qualifications:
- Strong proficiency in SQL and relational database management systems (RDBMS) such as Microsoft SQL Server, MySQL, or PostgreSQL
- Experience in writing and optimizing complex SQL queries and stored procedures
- Knowledge of database indexing, normalization, and performance tuning techniques (see the indexing sketch after this listing)
- Understanding of database security best practices
- Familiarity with ETL tools and processes
- Experience in data modeling and database design principles
- Ability to work with large datasets and optimize query performance
- Strong problem-solving and analytical skills

Desired Skills:
- Experience with cloud-based database solutions (Azure SQL, AWS RDS, or Google Cloud SQL)
- Knowledge of NoSQL databases such as MongoDB or Cassandra
- Exposure to database automation and DevOps practices
- Experience with reporting and BI tools like Power BI, Tableau, or SSRS
- Strong communication and collaboration skills to work effectively with development teams
- Ability to adapt to fast-paced environments and manage multiple database projects
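As a small illustration of the indexing and query-plan work the role involves, here is a self-contained sketch using Python's built-in sqlite3 module; the schema is hypothetical, and production work would target SQL Server, MySQL, or PostgreSQL with their own EXPLAIN facilities:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical orders table with many rows per customer.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 500, float(i)) for i in range(10_000)],
)

# Without an index, a customer lookup scans the whole table.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()
print("before index:", plan)

# Adding an index lets the same query use a B-tree search instead.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()
print("after index:", plan)

conn.close()
```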
Posted 4 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
By clicking the "Apply" button, I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda's Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge.

Job Description:

The Future Begins Here: At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, the city which is India's epicenter of innovation, has been selected to be home to Takeda's recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement.

At Takeda's ICC we Unite in Diversity: Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for the backgrounds and abilities they bring to our company. We are continuously improving our collaborators' journey at Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team.

The Opportunity: As a Data Scientist, you will have the opportunity to apply your analytical skills and expertise to extract meaningful insights from vast amounts of data. We are currently seeking a talented and experienced individual to join our team and contribute to our data-driven decision-making process.

Objectives:
- Collaborate with different business users, mainly in Supply Chain/Manufacturing, to understand the current state and identify opportunities to transform the business into a data-driven organization
- Translate processes and requirements into analytics solutions and metrics with an effective data strategy, data quality, and data accessibility for decision making
- Operationalize decision support solutions and drive user adoption, gathering feedback and Voice of Customer metrics in order to improve analytics services
- Understand the analytics drivers and the data to be modeled, apply the appropriate quantitative techniques to provide the business with actionable insights, and ensure the analytics model and data are accessible to end users to evaluate "what-if" scenarios and support decision making
- Evaluate the data, analytical models, and experiments periodically to validate hypotheses, ensuring they continue to provide business value as requirements and objectives evolve

Accountabilities:
- Collaborates with business partners in identifying analytical opportunities and developing BI-related goals and projects that will create strategically relevant insights
- Works with internal and external partners to develop the analytics vision and programs to advance BI solutions and practices
- Understands data and sources of data; strategizes with the IT development team and develops a process to collect, ingest, and deliver data along with proper data models for analytical needs
- Interacts with business users to define pain points, problem statements, scope, and the analytics business case
- Develops solutions with recommended data models and business intelligence technologies, including data warehouses, data marts, OLAP modeling, dashboards/reporting, and data queries
- Works with DevOps and database teams to ensure proper design of system databases and appropriate integration with other enterprise applications
- Collaborates with the Enterprise Data and Analytics Team to design data models and visualization solutions that synthesize complex data for data mining and discovery
- Assists in defining requirements and facilitates workshops and prototyping sessions
- Develops and applies technologies such as machine-learning and deep-learning algorithms to enable advanced analytics product functionality

EDUCATION, BEHAVIOURAL COMPETENCIES AND SKILLS:
- Bachelor's Degree from an accredited institution in Data Science, Statistics, Computer Science, or a related field
- 3+ years of experience with statistical modeling such as clustering, segmentation, multivariate analysis, regression, etc., and analytics tools such as R, Python, Databricks, etc. required (a minimal clustering sketch follows this listing)
- Experience in developing and applying predictive and prescriptive modeling, deep learning, or other machine learning techniques a plus
- Hands-on development of AI solutions that comply with industry standards and government regulations
- Great numerical and analytical skills, as well as basic knowledge of Python analytics packages (pandas, scikit-learn, statsmodels)
- Ability to build and maintain scalable and reliable data pipelines that collect, transform, manipulate, and load data from internal and external sources
- Ability to use statistical tools to conduct data analysis and identify data quality issues throughout the data pipeline
- Experience with BI and visualization tools (e.g., Qlik, Power BI), ETL, NoSQL, and proven design skills a plus
- Excellent written and verbal communication skills, including the ability to interact effectively with multifunctional teams
- Experience working with agile teams

WHAT TAKEDA CAN OFFER YOU:
Takeda is certified as a Top Employer, not only in India but also globally. No investment we make pays greater dividends than taking good care of our people. At Takeda, you take the lead on building and shaping your own career. Joining the ICC in Bengaluru will give you access to high-end technology, continuous training, and a diverse and inclusive network of colleagues who will support your career growth.

BENEFITS:
It is our priority to provide competitive compensation and a benefits package that bridges your personal life with your professional career. Our benefits include:
- Competitive salary + performance annual bonus
- Flexible work environment, including hybrid working
- Comprehensive healthcare insurance plans for self, spouse, and children
- Group term life insurance and group accident insurance programs
- Health & wellness programs, including annual health screening and weekly health sessions for employees
- Employee Assistance Program
- 3 days of leave every year for voluntary service, in addition to humanitarian leaves
- Broad variety of learning platforms
- Diversity, equity, and inclusion programs
- Reimbursements for home internet & mobile phone
- Employee referral program
- Leaves: paternity leave (4 weeks), maternity leave (up to 26 weeks), bereavement leave (5 calendar days)

ABOUT ICC IN TAKEDA:
Takeda is leading a digital revolution. We're not just transforming our company; we're improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization.
Locations: IND - Bengaluru Worker Type: Employee Worker Sub-Type: Regular Time Type: Full time
Posted 4 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Gurgaon/Bangalore, India AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics team (IDA), is focused on driving innovation through optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team’s efforts towards creating, enhancing, and stabilizing the Enterprise data lake through the development of data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner. What You’ll Be DOING What will your essential responsibilities include? Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate. Understand current and future data consumption patterns and architecture (granular level), and partner with Architects to ensure optimal design of data layers. Apply best practices in data architecture: for example, the balance between materialization and virtualization, optimal levels of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning. Lead hands-on execution of research into new technologies, formulating frameworks for assessing new technology versus business benefit and the implications for data consumers. Act as a best practice expert and blueprint creator of ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform. Design prototypes and work in a fast-paced iterative solution delivery model. Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables. Use Harness for the deployment pipeline. Monitor performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed. Diagnose system performance issues related to data processing and implement solutions to address them. Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture. Maintain integrity and quality across all pipelines and environments. Understand and follow secure coding practices to make sure code is not vulnerable. You will report to the Technical Lead. What You Will BRING We’re looking for someone who has these abilities and skills: Required Skills And Abilities Effective communication skills. Bachelor’s degree in computer science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience. Relevant years of extensive work experience in various data engineering & modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills. Relevant years of programming experience using Databricks. Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
Solid knowledge of network and firewall concepts. Solid experience writing, optimizing and analyzing SQL. Relevant years of experience with Python. Ability to break down complex data requirements and architect solutions into achievable targets. Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile. Experience using Harness. Technical lead responsible for both individual and team deliveries. Desired Skills And Abilities Worked in big data migration projects. Worked on performance tuning both at database and big data platforms. Ability to interpret complex data requirements and architect solutions. Distinctive problem-solving and analytical skills combined with robust business acumen. Strong fundamentals in Parquet and Delta file formats. Effective knowledge of the Azure cloud computing platform. Familiarity with reporting software - Power BI is a plus. Familiarity with DBT is a plus. Passion for data and experience working within a data-driven organization. You care about what you do, and what we do. Who WE are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What we OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe. Robust support for Flexible Working Arrangements Enhanced family-friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future.
Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability.
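As a hedged illustration of the PySpark/Delta responsibilities listed above, here is a minimal Databricks-style ETL sketch; the storage paths, column names, and quality rule are placeholders, not AXA XL's actual pipelines.

```python
# Illustrative sketch of a PySpark ETL job writing Delta tables in Databricks.
# Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_etl").getOrCreate()

raw = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/policies/")

cleaned = (
    raw.dropDuplicates(["policy_id"])
       .filter(F.col("premium") > 0)              # basic data-quality gate
       .withColumn("ingest_date", F.current_date())
)

(cleaned.write.format("delta")
        .mode("overwrite")
        .partitionBy("ingest_date")               # example partitioning strategy
        .save("abfss://curated@account.dfs.core.windows.net/policies_delta/"))
```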
Posted 4 days ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
About This Role Aladdin Data: BlackRock is one of the world’s leading asset management firms and Aladdin® is the firm’s end-to-end operating system for investment professionals to see their whole portfolio, understand risk exposure, and act with precision. Aladdin is our operating platform to manage financial portfolios. It unites client data, operators, and technology needed to manage transactions in real time through every step of the investment process. Aladdin Data is at the core of the Aladdin platform, and increasingly, our ability to consume, store, analyze, and gain insight from data is a key component of our competitive advantage. Our mission is to deliver critical insights to our stakeholders, enabling them to make data-driven decisions. BlackRock’s Data Operations team is at the heart of our data ecosystem, ensuring seamless data pipeline operations across the firm. Within this team, the Process Engineering group focuses on building tools to enhance observability, improve operator experience, streamline operations, and provide analytics that drive continuous improvement across the organization. Key Responsibilities Strategic Leadership Drive the roadmap for process engineering initiatives that align with broader Data Operations and enterprise objectives. Partner on efforts to modernize legacy workflows and build scalable, reusable solutions that support operational efficiency, risk reduction, and enhanced observability. Define and track success metrics for operational performance and process health across critical data pipelines. Process Engineering & Solutioning Design and develop tools and products to support operational efficiency, observability, risk management, and KPI tracking. Define success criteria for data operations in collaboration with stakeholders across teams. Break down complex data challenges into scalable, manageable solutions aligned with business needs. Proactively identify operational inefficiencies and deliver data-driven improvements. Data Insights & Visualization Design data science solutions to analyze vendor data trends, identify anomalies, and surface actionable insights for business users and data stewards. Develop and maintain dashboards (e.g., Power BI, Tableau) that provide real-time visibility into vendor data quality, usage patterns, and operational health. Create metrics and KPIs that measure vendor data performance, relevance, and alignment with business needs. Quality Control & Data Governance Build automated QC frameworks and anomaly detection models to validate data integrity across ingestion points. Work with data engineering and governance teams to embed robust validation rules and control checks into pipelines. Reduce manual oversight by building scalable, intelligent solutions that detect, report, and in some cases self-heal data issues. Testing & Quality Assurance Collaborate with data engineering and stewardship teams to validate data integrity throughout ETL processes. Lead the automation of testing frameworks for deploying new datasets or new pipelines. Collaboration & Delivery Work closely with internal and external stakeholders to align technical solutions with business objectives. Communicate effectively with both technical and non-technical teams. Operate in an agile environment, managing multiple priorities and ensuring timely delivery of high-quality data solutions.
Experience & Education 8+ years of experience in data engineering, data operations, analytics, or related fields, with at least 3 years in a leadership or senior IC capacity. Bachelor's or Master’s degree in a quantitative field (Computer Science, Data Science, Statistics, Engineering, or Finance). Experience working with financial market data providers (e.g., Bloomberg, Refinitiv, MSCI) is highly valued. Proven track record of building and deploying ML models. Technical Expertise Deep proficiency in SQL and Python, with hands-on experience in data visualization (Power BI, Tableau), cloud data platforms (e.g., Snowflake), and Unix-based systems. Exposure to modern frontend frameworks (React JS) and microservices-based architectures is a strong plus. Familiarity with various database systems (Relational, NoSQL, Graph) and scalable data processing techniques. Leadership & Communication Skills Proven ability to lead cross-functional teams and influence without authority in a global matrixed organization. Exceptional communication skills, with a track record of presenting complex technical topics to senior stakeholders and non-technical audiences. Strong organizational and prioritization skills, with a results-oriented mindset and experience in agile project delivery. Preferred Qualifications Certification in Snowflake or equivalent cloud data platforms Certification in Power BI or other analytics tools Experience leading Agile teams and driving enterprise-level transformation initiatives Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. 
We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
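A small, hypothetical sketch of the automated QC / anomaly-detection idea described in this posting: flagging vendor feed volumes that deviate sharply from recent history. The input file and threshold are illustrative only.

```python
# Hypothetical sketch: flag days whose record counts fall outside ~3 sigma
# of a 30-day rolling baseline. Input data is invented.
import pandas as pd

counts = pd.read_csv("vendor_feed_counts.csv", parse_dates=["date"])

rolling = counts["records"].rolling(window=30, min_periods=10)
z = (counts["records"] - rolling.mean()) / rolling.std()

anomalies = counts[z.abs() > 3]
print(anomalies[["date", "records"]])   # candidates for steward review
```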
Posted 4 days ago
6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description Oracle Customer Success Services Building on the mindset that “Who knows Oracle better than Oracle?”, Oracle Customer Success Services assists customers with their requirements for some of the most cutting-edge applications and solutions, utilizing the strengths of more than two decades of expertise in developing mission-critical solutions for enterprise customers and combining it with cutting-edge technology to provide speed, flexibility, resiliency, and security, enabling customers to optimize their investment, minimize risk, and achieve more. The business was established with an entrepreneurial mindset and supports a vibrant, imaginative, and highly varied workplace. We are free of obligations, so we'll need your help to turn it into a premier engineering hub that prioritizes quality. Why? Oracle Customer Success Services Engineering is responsible for designing, building, and managing cutting-edge solutions, services, and core platforms to support the managed cloud business, including but not limited to Oracle Cloud Infrastructure (OCI), Oracle Cloud Applications (SaaS) & Oracle Enterprise Applications. This position is for the CSS Architecture Team, and we are searching for the finest and brightest technologists as we embark on the road of cloud-native digital transformation. We operate under a garage culture, rely on cutting-edge technology in our daily work, and provide a highly innovative, creative, and experimental work environment. We prefer to innovate and move quickly, putting a strong emphasis on scalability and robustness. We need your assistance to build a top-tier engineering team that has a significant influence. What? As a Principal Data Science & AIML Engineer within the CSS CDO Architecture & Platform team, you’ll lead efforts in designing and building scalable, distributed, resilient services that provide artificial intelligence and machine learning capabilities on OCI & Oracle Cloud Applications for the business. You will be responsible for the design and development of machine learning systems and applications, ensuring they meet the needs of our clients and align with the company's strategic objectives. The ideal candidate will have extensive experience in machine learning algorithms, model creation and evaluation, data engineering and data processing for large-scale distributed systems, and software development methodologies. We strongly believe in ownership and challenging the status quo. We expect you to bring critical thinking and long-term design impact while building solutions and products, defining system integrations, and addressing cross-cutting concerns. Being part of the architecture function also provides you with the unique ability to enforce new processes and design patterns that will be future-proof while building new services or products. As a thought leader, you will own and lead the complete SDLC from Architecture Design, Development, Test, Operational Readiness, and Platform SRE. Responsibilities As a member of the architecture team, you will be in charge of designing software products, services, and platforms, as well as creating, testing, and managing the systems and applications we create in line with the architecture patterns and standards. As a core member of the Architecture Chapter, you will be expected to advocate for the adoption of software architecture and design patterns among cross-functional teams both within and outside of engineering roles.
You will also be expected to act as a mentor and advisor to the team(s) within the software and AIML domain. As we push for digital transformation throughout the organization, you will constantly be expected to think creatively and optimize and harmonize business processes. Core Responsibilities Lead the development of machine learning models, their integration with the full-stack software ecosystem, and data engineering, and contribute to the design strategy. Collaborate with product managers and development teams to identify software requirements and define project scopes. Develop and maintain technical documentation, including architecture diagrams, design specifications, and system diagrams. Analyze and recommend new software technologies and platforms to ensure the company stays ahead of the curve. Work with development teams to ensure software projects are delivered on time, within budget, and to the required quality standards. Provide guidance and mentorship to junior developers. Stay up-to-date with industry trends and developments in software architecture and development practices. Required Qualifications Bachelor's or Master's Degree in Computer Science, Machine Learning/AI, or a closely related field. 6+ years of experience in software development, machine learning, data science, and data engineering design. Proven ability to build and manage enterprise-distributed and/or cloud-native systems. Broad knowledge of cutting-edge machine learning models and strong domain expertise in both traditional and deep learning, particularly in areas such as Recommendation Engines, NLP & Transformers, Computer Vision, and Generative AI. Advanced proficiency in Python and frameworks such as FastAPI, Dapr & Flask or equivalent. Deep experience with ML frameworks such as PyTorch, TensorFlow, and Scikit-learn. Hands-on experience building ML models from scratch, transfer learning, and Retrieval Augmented Generation (RAG) using various techniques (Native, Hybrid, C-RAG, Graph RAG, Agentic RAG, and Multi-Agent RAG). Experience building Agentic Systems with SLMs and LLMs using frameworks like Langgraph + Langchain, AutoGen, LlamaIndex, Bedrock, Vertex, Agent Development Kit, Model Context Protocol (MCP), and Haystack or equivalent. Experience in data engineering with lakehouse stacks, including ETL/ELT and data processing with Apache Hadoop, Spark, Flink, Beam, and dbt. Experience with data warehousing and lakes such as Apache Iceberg, Hudi, Delta Lake, and cloud-managed solutions like OCI Data Lakehouse. Experience in data visualization and analytics with Apache Superset, Apache Zeppelin, Oracle Analytics Cloud or similar. Hands-on experience working with various data types and storage formats, including NoSQL, SQL, Graph databases, and data serialization formats like Parquet and Arrow. Experience with real-time distributed systems using streaming data with Kafka, NiFi, or Pulsar. Strong expertise in software design concepts, patterns (e.g., 12-Factor Apps), and tools to create CNCF-compliant software, with hands-on knowledge of containerization technologies like Docker and Kubernetes. Proven ability to build and deploy software applications on one or more public cloud providers (OCI, AWS, Azure, GCP, or similar). Demonstrated ability to write full-stack applications using polyglot programming with languages/frameworks like FastAPI, Python, and Golang. Experience designing API-first systems with application stacks like FARM and MERN, and technologies such as gRPC and REST.
Solid understanding of Design Thinking, Test-Driven Development (TDD), BDD, and end-to-end SDLC. Experience in DevOps practices, including Kubernetes, CI/CD, Blue-Green, and Canary deployments. Experience with Microservice architecture patterns, including API Gateways, Event-Driven & Reactive Architecture, CQRS, and SAGA. Familiarity with OOP design principles (SOLID, DRY, KISS, Common Closure, and Module Encapsulation). Proven ability to design software systems using various design patterns (Creational, Structural, and Behavioral). Strong interpersonal skills and the ability to effectively communicate with business stakeholders. Demonstrated ability to drive technology adoption in AIML Solutions and the CNCF software stack. Excellent analytical, problem-solving, communication, and leadership skills. Qualifications Career Level - IC4 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
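For illustration, a minimal model-serving sketch with FastAPI, one of the frameworks this posting names; the model artifact and feature schema are placeholders, not Oracle's actual service.

```python
# Hypothetical sketch: exposing a pre-trained scikit-learn model over REST.
from fastapi import FastAPI
from pydantic import BaseModel
import joblib

app = FastAPI()
model = joblib.load("model.joblib")   # placeholder artifact

class Features(BaseModel):
    values: list[float]

@app.post("/predict")
def predict(payload: Features):
    # scikit-learn expects a 2-D array: one row per sample
    return {"prediction": model.predict([payload.values]).tolist()}
```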
Posted 4 days ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Description: Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Work together with data scientists and analysts to understand data needs and create effective data workflows. Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Using Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitor and resolve data pipeline problems to guarantee consistency and availability of the data. This role sits within Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications. 1. Applies scientific methods to analyse and solve software engineering problems. 2. Is responsible for the development and application of software engineering practice and knowledge in research, design, development and maintenance. 3. The work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. The software engineer builds skills and expertise in his/her software engineering discipline to reach the standard software engineer skills expectations for the applicable role, as defined in Professional Communities. 5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders.
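A brief, hypothetical sketch of the validation and cleansing step described above, as it might run before loading into Azure SQL Database or ADLS; the schema is invented for illustration.

```python
# Illustrative validation/cleansing pass over a landed extract.
import pandas as pd

df = pd.read_csv("landing/orders.csv")   # hypothetical landing-zone file

# Validate: required columns present, keys unique, amounts populated
assert {"order_id", "amount", "order_date"} <= set(df.columns)
df = df.drop_duplicates(subset="order_id").dropna(subset=["amount"])

# Cleanse: coerce dates, quarantine rows that fail to parse
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
print(f"unparseable dates dropped: {df['order_date'].isna().sum()}")
df = df.dropna(subset=["order_date"])
```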
Posted 4 days ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
About This Role Aladdin Data Introduction: BlackRock is one of the world’s leading asset management firms and Aladdin® is the firm’s end-to-end operating system for investment professionals to see their whole portfolio, understand risk exposure, and act with precision. Aladdin is our operating platform to manage financial portfolios. It unites client data, operators, and technology needed to manage transactions in real time through every step of the investment process. Aladdin Data is at the core of the Aladdin platform, and increasingly, our ability to consume, store, analyze, and gain insight from data is a key component of our competitive advantage. Our mission is to deliver critical insights to our stakeholders, enabling them to make data-driven decisions. BlackRock’s Data Operations team is at the heart of our data ecosystem, ensuring seamless data pipeline operations across the firm. Within this team, the Process Engineering group focuses on building tools to enhance observability, improve operator experience, streamline operations, and provide analytics that drive continuous improvement across the organization. Key Responsibilities Process Engineering & Solutioning Design and develop tools and products to support operational efficiency, observability, risk management, and KPI tracking. Define success criteria for data operations in collaboration with stakeholders across teams. Break down complex data challenges into scalable, manageable solutions aligned with business needs. Proactively identify operational inefficiencies and deliver data-driven improvements. Testing & Quality Assurance Collaborate with data engineering and stewardship teams to validate data integrity throughout ETL processes. Ensure compliance with data governance controls during testing and assist in resolving data quality issues. Collaboration & Delivery Work closely with internal and external stakeholders to align technical solutions with business objectives. Communicate effectively with both technical and non-technical teams. Operate in an agile environment, managing multiple priorities and ensuring timely delivery of high-quality data solutions. What We’re Looking For 4+ years in data engineering, data operations, data analytics, or related roles. Bachelor's degree in Computer Science, Information Technology, Finance, or a related field. Experience in financial services is a plus, but not required. Technical Skills Strong proficiency in SQL, Python, and data visualization tools (Power BI, Tableau). Experience with Unix environments and modern cloud platforms like Snowflake. Familiarity with frontend frameworks (React JS) is a plus. Understanding of database types (Relational, NoSQL, Graph) and modern data architecture. Familiarity with pipeline monitoring tools and logging frameworks (ELK, Grafana). Soft Skills Strong analytical and problem-solving skills with excellent attention to detail. Ability to clearly communicate complex concepts to varied audiences. Proven organizational skills and experience managing multiple projects in agile environments. Team player with a collaborative mindset.
Preferred Qualifications Snowflake Certification Power BI Certification Experience with Agile development methodology Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
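As a hedged illustration of the SQL/Python/Snowflake skills above, a sketch that pulls pipeline run metrics out of Snowflake into pandas for a dashboard feed; the account, credentials, and table are hypothetical.

```python
# Hypothetical sketch: summarize last week's pipeline runs from Snowflake.
import pandas as pd
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="ops_monitor", password="...",
    warehouse="ANALYTICS_WH", database="OPS",
)

query = """
    SELECT run_date, pipeline, status, rows_loaded
    FROM pipeline_runs
    WHERE run_date >= DATEADD(day, -7, CURRENT_DATE)
"""
df = pd.read_sql(query, conn)
print(df.groupby(["pipeline", "status"])["rows_loaded"].sum())
```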
Posted 4 days ago
6.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Introduction A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio. Your Role And Responsibilities As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world’s technology leader. Come to IBM and make a global impact! We look for: the ability to be a team player; the ability and skill to train other people in procedural and technical topics; and strong communication and collaboration skills. Preferred Education Master's Degree Required Technical And Professional Expertise Strong knowledge and experience in database design, modelling and development using PL/SQL (minimum of 6 years). Proficiency with Oracle databases and tools, such as SQL Developer and Toad. In-depth understanding of SQL tuning and optimization techniques. Knowledge of database performance monitoring and troubleshooting. Familiarity with ETL processes and data integration techniques, and strong analytical and problem-solving skills. Preferred Technical And Professional Experience Ability to work in a fast-paced environment and meet deadlines. Knowledge of agile software development practices is a plus. Bachelor's degree in computer science or a related field is preferred, but not required.
Posted 4 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Overview Domo's AI and Data Products Platform lets people channel AI and data into innovative uses that deliver a measurable impact. Anyone can use Domo to prepare, analyze, visualize, automate, and build data products that are amplified by AI. Position Summary Working as a member of Domo’s Client Services team, the Associate Technical Consultant will be focused on the implementation of fault-tolerant, highly scalable solutions. The successful candidate will have a minimum of 3 years working hands-on with data. This individual will join an enthusiastic, fast-paced and dynamic team at Domo. A successful candidate will have demonstrated sustained exceptional performance, innovation, creativity, insight, and good judgment. Key Responsibilities Partner with business users and technical teams to understand the data requirements and support solutions development; Assist in implementing best practices for data ingestion, transformation and semantic modelling; Aggregate, transform and prepare large data sets for use within Domo solutions; Ensure data quality and perform validation across pipelines and reports; Write Python scripts to automate governance processes; Create workflows in Domo to automate business processes; Build custom Domo applications or custom bricks to support unique client use cases; Develop Agent Catalysts to deliver generative AI-powered insights within Domo, enabling intelligent data exploration, narrative generation, and proactive decision support through embedded AI features; Continuously learn and apply best practices to drive customer enablement and success; Support the documentation of data pipelines and the development of artifacts for long-term customer enablement. Job Requirements 3+ years of experience supporting business intelligence systems in a BI or ETL Developer role; Expert SQL skills required; Expertise with Windows and Linux environments; Expertise with at least one of the following database technologies and familiarity with the others: relational, columnar and NoSQL (e.g., MySQL, Oracle, MSSQL, Vertica, MongoDB); Understanding of data modelling skills (i.e. conceptual, logical and physical model design - with both traditional 3rd normal form as well as dimensional modelling, such as star and snowflake); Experience dealing with large data sets; Goal oriented with strong attention to detail; Proven experience in effectively partnering with business teams to deliver their goals and outcomes; Bachelor's Degree in Information Systems, Statistics, Computer Science or related field preferred OR equivalent professional experience; Excellent problem-solving skills and creativity; Ability to think outside the box; Ability to learn and adapt quickly to varied requirements; Thrive in a fast-paced environment. NICE TO HAVE Experience working with APIs; Experience working with Web Technologies (Javascript, Html, CSS); Experience with scripting technologies (Java, Python, R, etc.); Experience working with Snowflake, Databricks or BigQuery is a plus; Experience defining scope and requirements for projects; Excellent oral and written communication skills, and comfort presenting to everyone from entry-level employees to senior vice presidents; Experience with statistical methodologies; Experience with a wide variety of business data (Marketing, Finance, Operations, etc.); Experience with Large ERP systems (SAP, Oracle JD Edwards, Microsoft Dynamics, NetSuite, etc.); Understanding of Data Science, Data Modelling and analytics.
LOCATION: Pune, Maharashtra, India India Benefits & Perks Medical insurance provided Maternity and paternity leave policies Baby bucks: a cash allowance to spend on anything for every newborn or child adopted “Haute Mama”: cash allowance for maternity wardrobe benefit (only for women employees) Annual leave of 18 days + 10 holidays + 12 sick leaves Sodexo Meal Pass Health and Wellness Benefit One-time Technology Benefit: cash allowance towards the purchase of a tablet or smartwatch Corporate National Pension Scheme Employee Assistance Programme (EAP) Marriage leaves up to 3 days Bereavement leaves up to 5 days Domo is an equal opportunity employer.
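A small, hypothetical sketch of the "Python scripts to automate governance processes" responsibility: scanning dataset metadata over a REST API and flagging datasets with no owner. The endpoint and fields are placeholders, not Domo's actual API.

```python
# Hypothetical governance check against a placeholder metadata API.
import requests

BASE = "https://api.example.com/v1"            # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credentials

resp = requests.get(f"{BASE}/datasets", headers=HEADERS, timeout=30)
resp.raise_for_status()

for dataset in resp.json():
    if not dataset.get("owner"):
        print(f"governance gap: dataset '{dataset['name']}' has no owner")
```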
Posted 4 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Azure Data Engineer - Senior As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance. The opportunity We’re looking for candidates with strong technology and data understanding in the big data engineering space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a part of a growing Data and Analytics team. Your Key Responsibilities Develop & deploy big data pipelines in a cloud environment using Azure Cloud services; ETL design, development and migration of existing on-prem ETL routines to Cloud Services; Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams; Design and optimize model codes for faster execution. Skills And Attributes For Success Overall 3+ years of IT experience with 2+ years of relevant experience in Azure Data Factory (ADF), with good hands-on exposure to the latest ADF version. Hands-on experience with Azure Functions & Azure Synapse (formerly SQL Data Warehouse). Should have project experience in Azure Data Lake / Blob (storage purposes). Should have a basic understanding of Batch Account configuration and various control options. Sound knowledge of Databricks & Logic Apps. Should be able to coordinate independently with business stakeholders, understand the business requirements, and implement the requirements using ADF. To qualify for the role, you must: be a computer science graduate or equivalent with 3-7 years of industry experience; have working experience in an Agile-based delivery methodology (preferable); have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution; be an excellent communicator (written and verbal, formal and informal); and participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support. Ideally, you’ll also have Client management skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
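For illustration, a minimal sketch of landing a file in Azure Blob / ADLS with the azure-storage-blob SDK for a downstream ADF or Databricks pipeline to pick up; the container and connection string are placeholders.

```python
# Hypothetical sketch: upload a daily extract to a Blob/ADLS landing zone.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("raw-landing")

with open("daily_extract.csv", "rb") as fh:
    container.upload_blob(
        name="sales/2024/daily_extract.csv",  # placeholder path
        data=fh,
        overwrite=True,
    )
```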
Posted 4 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Description: Our client is an EU subsidiary of a Global Financial Bank working in multiple markets and asset classes. The Bank's Data Store has been transformed into a Data Warehouse (DWH), which is the central source for Regulatory Reporting. It is also intended to be the core data integration platform, which not only provides data for regulatory reporting but also provides data for Risk Modelling, Portfolio Analysis, Ad Hoc Analysis & Reporting (Finance, Risk, other), MI Reporting, Data Quality Management, etc. Due to the high demand of regulatory requirements, a lot of regulatory projects are in progress to reflect regulatory requirements in existing regulatory reports and to develop new regulatory reports on MDS. Examples are IFRS9, AnaCredit, IRBB, the new Deposit Guarantee Directive (DGSD), Bank Data Retrieval Portal (BDRP) and the Fundamental Review of the Trading Book (FRTB). The DWH / ETL Tester will work closely with the Development Team to design and build interfaces and integrate data from a variety of internal and external data sources into the new Enterprise Data Warehouse environment. The ETL Tester will be primarily responsible for testing the Enterprise Data Warehouse using automation within industry-recognized ETL standards, architecture, and best practices. Responsibilities: Testing changes (user stories) to the Bank's data warehouse system, supporting IT integration testing in TST, and supporting business stakeholders with User Acceptance Testing. It is a hands-on position: you will be required to write and execute test cases and build test automation where applicable. Overall Purpose of Job - Test the MDS data warehouse system - Validate regulatory reports - Support IT and Business stakeholders during the UAT phase - Contribute to improvement of testing and development processes - Work as part of a cross-functional team and take ownership of tasks - Contribute to testing deliverables - Ensure the implementation of test standards and best practices for the agile model & contribute to their development - Engage with internal stakeholders in various areas of the organization to seek alignment and collaboration - Deal with external stakeholders / vendors - Identify risks / issues and present associated mitigating actions, taking into account the criticality of the domain of the underlying business - Contribute to continuous improvement of testing standard processes. Additional responsibilities include working closely with the systems analysts and the application developers, utilizing functional design documentation and technical specifications to facilitate the creation and execution of manual and automated test scripts, performing data analysis and creation of test data, tracking and helping resolve defects, and ensuring that all testing is conducted and documented in adherence with the bank's standards. Mandatory Skills: Data Warehouse (DWH) ETL Test Management Mandatory Skills Description: Must have experience/expertise: Tester, Test Automation, Data Warehouse, Banking Technical: - At least 5 years of testing experience, of which at least 2 years in the finance industry, with good knowledge of Data Warehouse and RDBMS concepts. - Strong SQL scripting knowledge and hands-on experience with ETL & databases. - Expertise in new-age cloud-based Data Warehouse solutions - ADF, Snowflake, GCP, etc. - Hands-on expertise in writing complex SQL using multiple JOINs and highly complex functions to test various transformations and ETL requirements.
- Knowledge of and experience in creating test automation for database and ETL testing regression suites. - Automation using Selenium with Python (or JavaScript), Python scripts, shell scripts. - Knowledge of framework design and REST API testing of databases using Python. - Experience using the Atlassian tool set, Azure DevOps, and code & version management - Git, Bitbucket, Azure Repos, etc. - Help and provide inputs for the creation of test plans to address the needs of cloud-based ETL pipelines. Non-Technical: - Able to work in an agile environment - Experience in working on high-priority projects (high pressure on delivery) - Some flexibility outside 9-5 working hours (Netherlands time zone) - Able to work in a demanding environment with a pragmatic approach and a "can do" attitude - Able to work independently and also to collaborate across the organization - Highly developed problem-solving skills with minimal supervision - Able to easily adapt to new circumstances / technologies / procedures - Stress-resistant and constructive - whatever the context - Able to align with existing standards and act with attention to detail. Nice-to-Have Skills Description: - Experience with financial regulatory reports - Experience in test automation for data warehouses (using Bamboo) Software skills: - Bitbucket - Bamboo - Azure Tech Stack - Azure Data Factory - WKFS OneSumX reporting generator - Analytics tools such as Power BI / Excel / SSRS / SSAS, WinSCP
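A short pytest-style sketch of the source-to-target reconciliation checks such an ETL regression suite automates; the engines, tables, and queries are illustrative only, not the client's environment.

```python
# Hypothetical ETL regression checks: run with `pytest`.
import sqlalchemy as sa

SRC = sa.create_engine("oracle+oracledb://user:pwd@src-host/SRCDB")  # placeholder
TGT = sa.create_engine("snowflake://user:pwd@account/OPS/PUBLIC")    # placeholder

def scalar(engine, sql):
    with engine.connect() as conn:
        return conn.execute(sa.text(sql)).scalar()

def test_row_counts_match():
    src = scalar(SRC, "SELECT COUNT(*) FROM loans")
    tgt = scalar(TGT, "SELECT COUNT(*) FROM dwh_loans")
    assert src == tgt, f"row count mismatch: source={src}, target={tgt}"

def test_no_null_business_keys():
    nulls = scalar(TGT, "SELECT COUNT(*) FROM dwh_loans WHERE loan_id IS NULL")
    assert nulls == 0
```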
Posted 4 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description About Amazon.com: Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world. About Team The RBS team is an integral part of Amazon online product lifecycle and buying operations. The team is designed to ensure Amazon remains competitive in the online retail space with the best price, wide selection and good product information. The team’s primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The tasks handled by this group have a direct impact on customer buying decisions and online user experience. Overview Of The Role The ideal candidate will be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. You will be detail-oriented and organized, capable of handling multiple projects at once, and capable of dealing with ambiguity and rapidly changing priorities. You will have expertise in process optimizations and systems thinking and will be required to engage directly with multiple internal teams to drive business projects/automation for the RBS team. Candidates must be successful both as individual contributors and in a team environment, and must be customer-centric. Our environment is fast-paced and requires someone who is flexible, detail-oriented, and comfortable working in a deadline-driven work environment. Responsibilities Include Working across teams and the Ops organization at country, regional and/or cross-regional level to drive improvements and enable the implementation of solutions for customers, cost savings in process workflows, systems configuration, and performance metrics. Basic Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field Proficiency in automation using Python Excellent oral and written communication skills Experience with SQL, ETL processes, or data transformation Preferred Qualifications Experience with scripting and automation tools Familiarity with Infrastructure as Code (IaC) tools such as AWS CDK Knowledge of AWS services such as SQS, SNS, CloudWatch and DynamoDB Understanding of DevOps practices, including CI/CD pipelines and monitoring solutions Understanding of cloud services, serverless architecture, and systems integration Basic Qualifications Experience defining requirements and using data and metrics to draw business insights Experience with SQL or ETL 2+ years of tax, finance or a related analytical field experience Preferred Qualifications Knowledge of Python, VBA, Macros, Selenium scripts Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI MAA 15 SEZ - K20 Job ID: A2935487
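As a hedged example of the Python automation and AWS (CloudWatch) familiarity this posting lists, a minimal boto3 sketch publishing a custom metric; the namespace and metric name are invented, not Amazon RBS internals.

```python
# Hypothetical sketch: publish a custom pipeline metric to CloudWatch.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_data(
    Namespace="Example/CatalogAutomation",   # placeholder namespace
    MetricData=[{
        "MetricName": "ItemsEnriched",
        "Value": 1250,
        "Unit": "Count",
    }],
)
```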
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. You will collaborate with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you will be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground-breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience. Your Role And Responsibilities As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In This Role, Your Responsibilities May Include Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results Your Primary Responsibilities Include Develop & maintain data pipelines for batch & stream processing using Informatica PowerCenter or cloud ETL/ELT tools. Liaise with business teams and technical leads, gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing and support UAT. Work with data scientists and the business analytics team to assist in data ingestion and data-related technical issues. Preferred Education Master's Degree Required Technical And Professional Expertise Expertise in Data Warehousing / Information Management / Data Integration / Business Intelligence using the ETL tool Informatica PowerCenter Knowledge of Cloud, Power BI, and data migration on cloud skills.
Experience in Unix shell scripting and Python Experience with relational SQL, Big Data, etc. Preferred Technical And Professional Experience Knowledge of MS-Azure Cloud Experience in Informatica PowerCenter Experience in Unix shell scripting and Python
Posted 4 days ago
4.0 - 9.0 years
10 - 15 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About Us: KPI Partners is a leading provider of data analytics and performance management solutions, dedicated to helping organizations harness the power of their data to drive business success. Our team of experts is at the forefront of the data revolution, delivering innovative solutions to our clients. We are currently seeking a talented and experienced Senior Developer / Lead Data Engineer with expertise in Incorta to join our dynamic team. Job Description: As a Senior Developer / Lead Data Engineer at KPI Partners, you will play a critical role in designing, developing, and implementing data solutions using Incorta. You will work closely with cross-functional teams to understand data requirements, build and optimize data pipelines, and ensure that our data integration processes are efficient and effective. This position requires strong analytical skills, proficiency in Incorta, and a passion for leveraging data to drive business insights. Key Responsibilities: - Design and develop scalable data integration solutions using Incorta. - Collaborate with business stakeholders to gather data requirements and translate them into technical specifications. - Create and optimize data pipelines to ensure high data quality and availability. - Perform data modeling, ETL processes, and data engineering activities to support analytics initiatives. - Troubleshoot and resolve data-related issues across various systems and environments. - Mentor and guide junior developers and data engineers, fostering a culture of learning and collaboration. - Stay updated on industry trends, best practices, and emerging technologies related to data engineering and analytics. - Work with the implementation team to ensure smooth deployment of solutions and provide ongoing support. Qualifications: - Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field. - 5+ years of experience in data engineering or related roles with a strong focus on Incorta. - Expertise in Incorta and its features, along with experience in data modeling and ETL processes. - Proficiency in SQL and experience with relational databases (e.g., MySQL, Oracle, SQL Server). - Strong analytical and problem-solving skills, with the ability to work with complex data sets. - Excellent communication and collaboration skills to work effectively in a team-oriented environment. - Familiarity with cloud platforms (e.g., AWS, Azure) and data visualization tools is a plus. - Experience with programming languages such as Python, Java, or Scala is advantageous. Why Join KPI Partners? - Opportunity to work with a talented and passionate team in a fast-paced environment. - Competitive salary and benefits package. - Continuous learning and professional development opportunities. - A collaborative and inclusive workplace culture that values diversity and innovation. KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Join us at KPI Partners and help us unlock the power of data for our clients!
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. You will collaborate with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you will be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground-breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.
Your Role And Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In This Role, Your Responsibilities May Include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk to meet client requirements (see the sketch after this posting)
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Your Primary Responsibilities Include:
- Developing and maintaining data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools
- Liaising with business teams and technical leads to gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing, and support UAT
- Working with data scientists and the business analytics team to assist with data ingestion and data-related technical issues
Preferred Education
Master's Degree
Required Technical And Professional Expertise
- Expertise in data warehousing / information management / data integration / business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, big data, etc.
Preferred Technical And Professional Experience
- Knowledge of MS Azure Cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
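The enterprise search responsibility named above usually means standing up an index with an explicit mapping and shipping cleansed records into it. A minimal sketch against a local Elasticsearch cluster using the official Python client (the index name, document shape, and cluster URL are hypothetical):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical local cluster

# Create an index with an explicit mapping so fields are typed predictably.
es.indices.create(
    index="app-logs",
    mappings={"properties": {
        "timestamp": {"type": "date"},
        "level": {"type": "keyword"},
        "message": {"type": "text"},
    }},
)

# Index one cleansed record, then query it back with a full-text match.
es.index(index="app-logs", document={
    "timestamp": "2024-01-01T00:00:00Z", "level": "ERROR", "message": "batch load failed",
})
hits = es.search(index="app-logs", query={"match": {"message": "batch"}})
print(hits["hits"]["total"])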
Posted 4 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview
Prodapt is seeking a skilled GIS Full Stack Developer with extensive experience in GIS platforms such as ESRI, QGIS, and FME. The ideal candidate will have expertise in spatial database management using Oracle Spatial and PostgreSQL, proficiency in programming languages like Python, and experience with front-end technologies, web services, geospatial toolkits, and data integration.
Responsibilities:
- Software Development: Design, develop, and optimize software solutions for converting data between GIS systems (see the sketch after this posting).
- Testing and Debugging: Test and debug the conversion engine to ensure data accuracy and integrity.
- Documentation and Training: Document the conversion process and provide training to end-users as necessary.
- Collaboration: Work closely with cross-functional teams to gather requirements and develop solutions that align with business needs.
- Spatial Database Management: Design, develop, and maintain spatial databases, ensuring efficient storage and management of geographic data.
- Data Workflows: Create and manage ETL (Extract, Transform, Load) scripts to streamline data processing; design and implement data mapping solutions to ensure accurate data transfer between GIS systems; perform data cleansing tasks to maintain the quality and integrity of geographic data.
- GIS Software Integration: Integrate GIS software such as ArcGIS and QGIS with spatial databases, and customize front-end interfaces with custom commands.
- Front-End Development: Develop and customize front-end applications using React, JavaScript, and related technologies, integrating GIS functionalities.
- API Integration: Implement and integrate REST, SOAP, WFS, and WMS APIs for enhanced GIS functionality.
- Geospatial Toolkits: Utilize geospatial toolkits such as MapInfo, PostGIS, and GeoServer for application development and problem-solving.
- Continual Learning: Stay updated with the latest GIS technologies and methodologies to continually enhance the conversion engine and overall system performance.
Requirements
Required Skills:
- Proficiency in GIS platforms (ESRI, QGIS, FME).
- Expertise in spatial database management (Oracle Spatial, PostgreSQL).
- Strong programming skills (Python, .NET).
- Front-end development experience (React, JavaScript).
- Experience with geospatial/transfer data formats (JSON, XML).
- Familiarity with web services (WFS, WMS) and API integration.
- Experience in building ETL scripts, data mapping, and data cleansing.
- Ability to customize front-end interfaces and integrate GIS applications.
- Strong communication skills, both written and verbal.
Domain: Knowledge of Telecom and utility networks.
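Much of the conversion-engine work described above reduces to reading one spatial format, normalizing the coordinate reference system, and writing another. A minimal sketch using GeoPandas (the file names and target CRS are hypothetical; a production conversion engine would more likely run through FME workspaces or custom ETL scripts):

import geopandas as gpd

def convert_layer(src_path: str, dst_path: str, target_crs: str = "EPSG:4326") -> int:
    """Read a spatial layer, normalize its CRS, and write it out as a GeoPackage."""
    gdf = gpd.read_file(src_path)          # handles Shapefile, GeoJSON, GPKG, etc. via GDAL
    if gdf.crs is None:
        raise ValueError("Source layer has no CRS defined; cannot reproject safely")
    gdf = gdf.to_crs(target_crs)           # reproject to the target coordinate system
    gdf.to_file(dst_path, driver="GPKG")   # write as GeoPackage
    return len(gdf)

count = convert_layer("telecom_cables.geojson", "telecom_cables.gpkg")
print(f"Converted {count} features")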
Posted 4 days ago