5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As an SQL Developer (Corporate Title: AS) at Deutsche Bank in Pune, India, you will be part of the Legal CIO sub-domain for the DExTR programme. Your primary responsibility will be to contribute to the Enterprise Contract Management system, which supports the legal department in negotiating, drafting, managing, and storing contracts efficiently. The project aims to streamline the creation and amendment of legal documents, automate contract generation, enrich metadata, and enforce data quality through Document Assembly technology. You will also integrate negotiation workflow, document assembly, and document automation with standard desktop applications such as MS Word and MS Outlook to enhance the user experience and encourage adoption of the application.

Deutsche Bank is a client-centric global universal bank that prioritizes integrity, sustainable performance, and innovation. The Legal CIO department, within which you will be working, is responsible for managing legal risk and protecting the bank's culture, integrity, and reputation. The department is fully independent from the Business Divisions and reports directly to the Management Board.

Your key responsibilities will include designing, developing, and optimizing complex SQL queries, procedures, and views for data extraction and transformation. You will collaborate with business analysts and stakeholders to understand reporting requirements, build and manage query pipelines and ETL processes, and ensure data validation and quality assurance on SQL outputs and visualizations.

To excel in this role, you should possess strong analytical, problem-solving, and communication skills. Extensive experience in SQL development, relational databases (Oracle), and data distribution techniques is required. Hands-on experience creating data visualizations and reports in Tableau and SAP BO is essential, along with knowledge of cloud technologies, preferably GCP. Familiarity with Java programming, Agile methodologies such as Scrum and Kanban, and version control tools like Git will also be beneficial.

At Deutsche Bank, you will have access to comprehensive benefits, including a best-in-class leave policy, gender-neutral parental leave, reimbursement under the childcare assistance benefit, and sponsorship for industry-relevant certifications and education. You will also receive support through training, coaching, and a culture of continuous learning to aid your career progression. If you want to join a client-centric global bank and contribute to innovative projects within the Legal CIO domain, Deutsche Bank offers a positive, fair, and inclusive work environment where you can excel together with a team of experts dedicated to empowering success every day. Visit our company website for more information: https://www.db.com/company/company.htm
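As a rough illustration of the extraction-and-transformation work this role describes (not code from the DExTR programme itself), the sketch below runs a parameterized query against an Oracle source using the python-oracledb driver; the contracts table, column names, and connection details are hypothetical.

```python
# Minimal sketch: parameterized extraction query against an Oracle source,
# of the kind a reporting pipeline might feed into Tableau or SAP BO.
# Table, columns, and connection details below are illustrative only.
import oracledb  # python-oracledb driver


def extract_contract_metadata(dsn: str, user: str, password: str, as_of_date):
    """Return contract rows changed since as_of_date as a list of dicts."""
    query = """
        SELECT c.contract_id,
               c.counterparty_name,
               c.status,
               c.last_modified
        FROM   contracts c
        WHERE  c.last_modified >= :as_of_date
        ORDER  BY c.last_modified
    """
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(query, as_of_date=as_of_date)
            columns = [d[0].lower() for d in cur.description]
            return [dict(zip(columns, row)) for row in cur.fetchall()]
```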
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You are a highly skilled and motivated Data Engineer responsible for designing and developing data transformations and data models that ensure reliable and efficient data processing and analysis. You will work closely with cross-functional teams to support data-driven decision-making and contribute to the overall success of the insights teams.

Your key skills and responsibilities include: expertise in DBT (Data Build Tool) for data transformation and modeling; proficiency in Snowflake; a strong understanding of data architecture, modeling, and data warehousing best practices; designing, developing, and maintaining robust data pipelines using DBT and Snowflake; implementing and optimizing data ingestion processes; collaborating with stakeholders to ensure data integrity and quality; performing data analysis and profiling; documenting data workflows, models, and ETL processes; and staying updated with the latest trends in data engineering, DBT, and Snowflake.

You should have proven experience as a Data Engineer with a focus on data ingestion and ETL processes; experience with ETL tools and technologies such as Apache Airflow, Talend, or Informatica; proficiency in SQL and programming languages such as Python or Java; and familiarity with cloud platforms and services (e.g., AWS), including experience with AWS Lambda. You are expected to adhere to development best practices, conduct code reviews, participate in scrum ceremonies, communicate effectively with team members and stakeholders, and take ownership of assigned tasks, working independently to complete them.
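Purely as a hedged sketch of how the tools named above typically fit together, the following Apache Airflow DAG orchestrates dbt transformations against Snowflake from the command line; the DAG id, schedule, project path, and model selectors are assumptions, not details from the posting.

```python
# Illustrative sketch only (Airflow 2.x): orchestrating dbt transformations on
# Snowflake from Apache Airflow. DAG id, schedule, project path, and model
# selectors are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Materialize staging models first, then the downstream marts, then test.
    dbt_staging = BashOperator(
        task_id="dbt_run_staging",
        bash_command="cd /opt/dbt/project && dbt run --select staging",
    )
    dbt_marts = BashOperator(
        task_id="dbt_run_marts",
        bash_command="cd /opt/dbt/project && dbt run --select marts",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test",
    )
    dbt_staging >> dbt_marts >> dbt_test
```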
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Data Engineer, you will play a crucial role in architecting end-to-end data platforms on Azure, utilizing Azure Databricks, ADF, and ADLS Gen2. Your primary focus will involve defining frameworks for data ingestion, transformation, and orchestration, ensuring efficient and scalable cloud-native ETL/ELT design. Your expertise in SQL and Python will be essential in data modeling and optimization, contributing to the establishment of data quality and governance guidelines. Additionally, you will lead the design of data lakehouse solutions encompassing both batch and streaming data processing. Collaborating with cross-functional teams, you will implement CI/CD practices and performance tuning strategies to enhance the overall data platform efficiency. Your innovative approach will be instrumental in shaping the future of data architecture within the organization.
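To make the ingestion-transformation-load pattern concrete, here is a minimal PySpark sketch of a Databricks batch step that reads raw files from ADLS Gen2 and writes a Delta table; the storage account, paths, and column names are hypothetical.

```python
# Minimal PySpark sketch of a batch ingest step on Azure Databricks:
# read raw files from ADLS Gen2, apply light cleansing, write a Delta table.
# Storage account, container, paths, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/2024/"

orders = (
    spark.read
    .option("header", "true")
    .csv(raw_path)
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)

(
    orders.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("lakehouse.silver_orders")
)
```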
Posted 6 days ago
5.0 - 10.0 years
0 Lacs
maharashtra
On-site
As a Senior Associate Business Analyst in the General Insurance domain, you will be a key member of our consulting team based in Mumbai/Gurugram. With 5-10 years of experience in IT or consulting, you will play a crucial role in driving digital transformation and data-led decision-making for our leading General Insurance clients.

Your responsibilities will include developing a deep functional understanding of General Insurance operations such as policy administration, claims lifecycle, underwriting, billing, and regulatory reporting. You will lead data requirement gathering for projects involving data warehouses, data lakes, and operational data stores. Additionally, you will translate business needs into structured deliverables such as Business Requirement Documents, data dictionaries, source-to-target mappings, and more.

To excel in this role, you should possess strong expertise in data projects, including requirement gathering, transformation logic, mapping, and validation. Your skills in SQL for data analysis and transformation, along with knowledge of data warehouse architectures and data modeling techniques, will be critical. Experience with Agile and Waterfall delivery models, as well as excellent documentation and communication skills, are essential for success.

Preferred qualifications include experience with insurance core systems like Guidewire, understanding of regulatory requirements in the insurance industry (e.g., IFRS 17), exposure to cloud platforms like AWS or Azure, and familiarity with data visualization tools like Tableau or Power BI. Insurance certifications such as LOMA or III would be a plus. If you are a structured thinker with a strong foundation in data understanding and insurance domain knowledge, ready to collaborate with both business and technical teams to drive impactful digital transformations across the insurance value chain, we would like to hear from you. A B.E. (B.Tech)/M.E./M.Tech degree is required, and an MBA is preferable for this role.
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
kochi, kerala
On-site
At EY, you will be part of a globally connected powerhouse of diverse teams that will shape your future with confidence and help you succeed. As a Data Modeller Developer with Cloud Exposure in the EY GDS Data and Analytics (D&A) team, you will play a crucial role in solving complex business challenges through data and technology. You will have the opportunity to work in sectors such as Banking, Insurance, Manufacturing, Healthcare, Retail, Supply Chain, and Finance.

Your responsibilities will include developing and maintaining complex data models, collaborating with stakeholders to understand data requirements, evaluating data modelling tools and technologies, creating data dictionaries and metadata repositories, optimizing database performance, and staying updated with industry best practices in data modelling. You will also be expected to have hands-on experience in data modelling tools, proficiency in SQL, and familiarity with data warehousing and ETL processes.

To qualify for this role, you must have at least 8 years of experience in data modelling, a strong understanding of SQL, proficiency in tools like Erwin, Visio, or PowerDesigner, and experience in designing and implementing database structures. Cloud knowledge and certification, particularly the Azure DP-203 certification, will be advantageous. Strong analytical and problem-solving skills and excellent communication and documentation skills are essential for this role. Primary skills required for this position are Data Modelling and Advanced SQL. Client management skills and a willingness to learn new things in a fast-paced environment are also desirable.

Working at EY offers you inspiring projects, education, coaching, personal development opportunities, and the freedom to handle your role in a way that suits you best. EY is committed to building a better working world by creating new value for clients, people, society, and the planet. With a focus on data, AI, and advanced technology, EY teams help clients shape the future confidently and address pressing issues of today and tomorrow. As part of a globally connected network, you will have the opportunity to work across various services in more than 150 countries and territories.
Posted 6 days ago
5.0 - 10.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
At EY, you are part of a globally connected powerhouse where diverse teams come together to shape your future with confidence. Join EY to contribute towards building a better working world. As a SuccessFactors Employee Central Consultant within the Talent IT Service Delivery team, you will play a crucial role in the end-to-end delivery of programs and projects related to global applications and systems in the SAP technology stack. Your responsibilities will include coordinating and managing a team of configuration resources across SuccessFactors modules, ensuring the successful delivery of projects in collaboration with Project Managers, QA resources, and Support Teams.

Your essential functions in this role will involve demonstrating leadership by inspiring and mentoring consultants throughout project lifecycles. You will serve as the primary point of contact for Employee Central projects, overseeing all phases from discovery to deployment and support. Managing and mentoring a team of consultants and developing project plans, timelines, and resource allocations will be key aspects of your role. Additionally, you will facilitate workshops with stakeholders to understand business requirements, analyze client requirements, advise on best practices, and design comprehensive Employee Central solutions.

In terms of decision-making responsibilities, you will be expected to challenge and hold the Service Delivery team accountable, work within a matrix organization, prioritize relationships, manage project risks effectively, and demonstrate in-depth knowledge of EY competency principles and practices. Your ability to create an open, honest, accountable, and collaborative team environment will be essential as a leader and team player.

To excel in this role, you should possess in-depth technical expertise in SAP Employee Central and ONB 2.0, a strong understanding of HR processes, and knowledge of best practices. Flexibility, business analysis experience, and a basic understanding of data modeling are also necessary. Your educational background should include a degree in Computer Science and/or a business-related field, along with at least 10 years of experience in application services and client/supplier relationship management in a technology environment. Moreover, you should have a minimum of 5 years of experience implementing SAP SuccessFactors Employee Central or equivalent HRIS solutions, with hands-on experience in Employee Central configuration, business rules, workflows, and integrations. Certification in SuccessFactors Employee Central would be a plus. This role may involve international travel, and fluency in English is required.

By leveraging data, AI, and advanced technology, EY teams help clients shape the future with confidence and address critical issues. Join EY in building a better working world by creating new value for clients, people, society, and the planet while fostering trust in capital markets.
Posted 6 days ago
10.0 - 16.0 years
12 - 18 Lacs
noida
Work from Office
About the Role:

The Team: The team is responsible for building a carbon trading platform using emerging tools and technologies. The team works in an environment that offers ample opportunities to apply creative ideas to complex analytical problems. Every single day you will work with people from a wide variety of backgrounds and will develop a close team dynamic with coworkers from around the globe.

The Impact: You will make a meaningful contribution to building solutions for the user interfaces, web services, APIs, and data processing. The work you do will enable platform users to trade carbon credits.

What's in it for you:
- Build a career with a global company
- Work on code that fuels the global carbon markets
- Grow and improve your skills by working on enterprise-level products and new technologies
- Attractive benefits package (medical services, special discounts for gyms, meal vouchers)
- Ongoing education (participation in conferences and training)
- Access to the most interesting information technologies
- Flexible working hours

Responsibilities:
- Architect, design, and develop solutions within a multi-functional Agile team to support key business needs
- Design and implement software components for different IT systems
- Perform analysis and articulate solutions
- Design underlying engineering for use in multiple product offerings supporting a large volume of end-users
- Manage and improve existing solutions
- Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits
- Engineer components and common services based on standard corporate development models, languages, and tools
- Apply software engineering best practices while also leveraging automation across all elements of solution delivery
- Collaborate effectively with technical and non-technical stakeholders
- Document and demonstrate technical solutions by developing documentation, diagrams, code comments, etc.

What We're Looking For:

Basic Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or equivalent
- 10 to 16 years of experience in full-stack development with Java, Spring Boot, AWS, API development, RESTful services, data modelling, persistence stores, and ORMs
- Hands-on experience with Java and related technologies
- Excellent communication and interpersonal skills
- Strong analytical skills and learning agility
- Must be hands-on in coding, specifically using NodeJS and related technologies
- Ability to work in a collaborative environment
- Team leadership experience
- Knowledge and experience of deploying to cloud services, preferably AWS
- Strong expertise and knowledge in microservices
- Cloud experience in AWS or Azure

Optional Qualifications:
- Other JavaScript frameworks like Angular
- Proficiency with software development lifecycle (SDLC) methodologies like Agile and test-driven development
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a data engineer at our company, you will play a crucial role in designing and implementing large-scale systems, particularly focusing on complex data pipelines. You will be responsible for driving projects from the initial stages to production, collaborating with stakeholders, analysts, and scientists to gather requirements and transform them into a data engineering roadmap. Your ability to communicate effectively, work well within a team, and showcase strong technical skills will be key in this role.

Your primary responsibilities will include collaborating with various teams across different tech sites to achieve Objectives and Key Results (OKRs) that propel our company forward. You will enhance data layers to support the development of next-generation products resulting from our strategic initiatives. Additionally, you will design and construct data pipelines to manage a range of tasks such as data extraction, cleansing, transformation, enrichment, and loading to meet specific business needs.

To excel in this role, you should possess strong SQL proficiency, a solid understanding of Data Warehousing and Data Modelling concepts, and hands-on experience with the Hadoop tech stack, including HDFS, Hive, Oozie, Airflow, MapReduce, and Spark. Proficiency in programming languages such as Python, Java, and Scala is essential, along with experience in building ETL data pipelines and in performance troubleshooting and tuning.

Preferred qualifications for this position include familiarity with Data Warehouse (DW) or Business Intelligence (BI) tools like Anaplan, TM1, and Hyperion, as well as a track record of delivering high-quality end-to-end data solutions in an agile environment. You should be driven to optimize systems for efficiency, consistently propose and implement innovative ideas, mentor junior team members, and lead collaborative efforts with other engineers when necessary. If you are looking for a challenging role where you can leverage your data engineering skills to drive impactful projects and contribute to the growth of our organization, we encourage you to apply for this position.
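The extract-cleanse-transform-load pattern mentioned above might look roughly like this PySpark sketch over Hive tables; the database, table, and column names are placeholders, not the company's actual schema.

```python
# Hypothetical sketch of the extract-cleanse-transform-load pattern the role
# describes, using Spark SQL over Hive tables; database, table, and column
# names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("clickstream_daily_etl")
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: pull one day of raw events from a partitioned Hive table.
events = spark.sql(
    "SELECT user_id, event_type, event_ts "
    "FROM raw_db.clickstream WHERE dt = '2024-06-01'"
)

# Cleanse and enrich: drop malformed rows, derive a daily per-user aggregate.
daily_summary = (
    events
    .filter(F.col("user_id").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Load: write back to a curated Hive table, partitioned by date.
(
    daily_summary.write
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("curated_db.clickstream_daily")
)
```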
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced professional with 5+ years of experience in data engineering. You have hands-on working experience in SQL and Python or Scala, and a deep understanding of cloud design patterns and their implementation. You also have experience working with Snowflake as a data warehouse solution and with Power BI data integration.

Your role will involve designing, developing, and maintaining scalable data pipelines and ETL processes and working with structured and unstructured data from multiple sources; a strong understanding of data modelling, warehousing, and relational database systems is expected. Additionally, you have hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar. You are known for your strong problem-solving skills and a passion for continuous improvement, and your excellent communication skills enable you to collaborate effectively with cross-functional teams.

In this role, you will be responsible for writing and reviewing high-quality code, understanding the client's business use cases and technical requirements, and converting them into technical designs that elegantly meet the requirements. You will be expected to identify different solutions and select the best option that aligns with the client's requirements, define guidelines for non-functional requirement (NFR) considerations, and review architecture and design aspects to ensure best practices are followed. You will also be involved in developing and designing solutions for defined functional and non-functional requirements, as well as carrying out POCs to validate suggested designs and technologies.

To qualify for this position, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As an experienced MDM Developer, you will be responsible for developing and configuring Informatica MDM on-premises solutions to meet business requirements. Your role will involve implementing MDM workflows, match rules, business rules, and data validations using Informatica tools. You will also design and develop data integration workflows to load and synchronize data with the MDM Hub, integrating it with external systems such as ERP, CRM, and data warehouses.

In addition, you will be required to implement data quality rules and processes using Informatica Data Quality (IDQ) to ensure clean and consistent master data. Supporting data governance initiatives will be a key part of your responsibilities, which includes maintaining metadata, data lineage, and audit trails in the MDM system. Your role will also involve testing and validation activities, including unit, integration, and system testing to validate MDM configurations and workflows. You will support user acceptance testing (UAT) and ensure timely resolution of identified issues. Monitoring and maintaining the MDM environment to ensure optimal performance and uptime will be essential, along with troubleshooting and resolving MDM-related issues. Documentation and collaboration are crucial aspects of the role, requiring you to create and maintain technical documentation and to collaborate with business analysts, data stewards, and stakeholders to understand and address data management needs.

To be successful in this role, you should have at least 5 years of experience in MDM development, with a minimum of 3 years working with Informatica MDM on-premise. Proficiency in Informatica MDM components such as Hub Console, IDD, Match & Merge, and Hierarchy Manager is essential. Strong knowledge of Informatica Data Quality (IDQ), data profiling, SQL, PL/SQL, relational databases (e.g., Oracle, Teradata), data modeling, data integration, and ETL processes is required. Hands-on experience implementing MDM solutions for domains like Customer, Product, or Employee, and familiarity with integrating MDM systems with enterprise applications, will be beneficial. In addition to technical expertise, you should possess strong analytical and problem-solving skills, the ability to work independently and collaboratively in a team environment, and good communication and documentation skills.
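Informatica MDM evaluates match rules inside the Hub rather than in hand-written code, but the underlying idea of a blocking-and-scoring match rule can be sketched in plain pandas purely for illustration; the column names and threshold below are assumptions, not Informatica's API.

```python
# Illustration only: a simplified stand-in for an MDM match rule - block
# candidate records by postal code, then score name similarity.
# Column names and the threshold are hypothetical.
from difflib import SequenceMatcher

import pandas as pd


def name_similarity(a: str, b: str) -> float:
    """Simple 0-1 similarity score between two names."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()


def candidate_matches(customers: pd.DataFrame, threshold: float = 0.9) -> pd.DataFrame:
    """Pair records that share a postal code, then score name similarity."""
    blocked = customers.merge(customers, on=["postal_code"], suffixes=("_a", "_b"))
    # Keep each unordered pair once and drop self-joins.
    blocked = blocked[blocked["customer_id_a"] < blocked["customer_id_b"]]
    blocked["name_score"] = blocked.apply(
        lambda r: name_similarity(r["full_name_a"], r["full_name_b"]), axis=1
    )
    return blocked[blocked["name_score"] >= threshold][
        ["customer_id_a", "customer_id_b", "name_score"]
    ]
```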
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Software Professional at WhiteBlue, you will be an integral part of our exclusive cloud services company, dedicated to transforming the world with digital technology. We take pride in our ability to serve our customers with experienced and certified professional cloud talent. Our primary focus is on creating world-class human capital with hands-on cloud experience to build, migrate, and support complex cloud environments. At WhiteBlue, we are committed to providing tailored services and solutions to each customer, with a key emphasis on delivering measurable business outcomes.

Joining our passionate team of software professionals means working in a collaborative environment where we strive to work, create, and inspire each other every day. We are purpose-driven and dedicated to achieving mutual success. Our corporate culture values each employee, and we are always willing to go the extra mile to ensure customer satisfaction. Leveraging our cutting-edge expertise, we deliver the best solutions to our clients with a relentless focus on outcomes and customer-centricity. At WhiteBlue, we pave the way for our customers' future in the realms of Cloud, Data, and IoT.

Your role at WhiteBlue will involve designing, developing, and maintaining Power BI reports and dashboards for business users. You will be responsible for running large data platforms and related programs to provide business intelligence support, translating business requirements into effective visualizations using various data sources, creating data models, DAX calculations, and custom measures to support business analytics needs, optimizing performance and ensuring data accuracy in Power BI reports, troubleshooting and resolving data-related issues, staying updated on Power BI features and best practices, and training end-users on utilizing Power BI for self-service analytics.

To excel in this role, you must demonstrate proficiency in Power BI Desktop and Power BI Service, a strong understanding of data modeling concepts and the DAX language, experience with data visualization best practices, knowledge of SQL for data extraction and transformation, and the ability to work with various data sources such as Excel, SQL databases, and APIs. If you are someone who is passionate about leveraging Power BI to drive business insights, enjoys collaborating with a dynamic team, and is committed to delivering exceptional results for our customers, we welcome you to join us at WhiteBlue and be a part of our transformative journey in the digital realm.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
Aera Technology is the Decision Intelligence company enabling enterprises to operate sustainably, intelligently, and efficiently with our platform, Aera Decision Cloud. By integrating with existing systems, we digitize, augment, and automate decisions in real time, transforming decision-making for some of the world's leading brands. We deliver millions of recommendations resulting in significant revenue gains and cost savings globally. At Aera Technology, we empower the largest enterprises globally to revolutionize decision-making through Decision Intelligence. Our platform understands your business processes, provides real-time recommendations, predicts outcomes, and autonomously takes action, offering the business agility needed to navigate today's dynamic environment. This role is based in our Pune office.

Responsibilities:
- Define detailed product requirements to deliver real-time prescriptive recommendations and drive decision execution
- Understand market demands, target users, and use cases, gathering feedback continuously to address user pain points and business needs
- Collaborate with engineering and design teams to drive product requirements and ensure timely release of capabilities
- Work with other product managers to align and integrate with the Aera Cognitive Operating System
- Collaborate with Data Scientists, Analysts, and Data Engineers to develop intelligent product capabilities
- Monitor competition to gain market insights and help define innovative features within Aera
- Engage with key stakeholders like product marketing to create product materials and resources
- Conduct product demos and present roadmaps to customers and prospects regularly for feedback and advocacy

About You:
- Leader with a track record of working across functions, overcoming barriers, and driving results
- Passionate advocate for your products, dedicated to delivering excellence and value
- Blend of business and technical acumen with a focus on execution
- Pragmatic approach balancing nuanced details and strategic perspectives
- Eager learner who seeks new information, feedback, and optimal solutions
- Excellent communicator and collaborator adept at motivating teams towards common goals

Experience:
- 10+ years in Product Management, Engineering, or Implementation roles, with 4+ years in analytics or data-driven products
- Strong technical background encompassing data connectors, data warehousing, APIs, and related technologies
- Deep understanding of the data, analytics, and business process management space
- Experience with decision management and BPM products is advantageous
- Bachelor's degree in Engineering/Computer Science or a related technical discipline

If you resonate with our vision of creating a sustainable, intelligent, and efficient world, Aera Technology welcomes you. Join us in our journey as a series D start-up established in 2017, with global teams across various locations. Let's build together!

Benefits Summary:
- Competitive salary, company stock options, and comprehensive medical coverage
- Group Medical Insurance, Term Insurance, Accidental Insurance, paid time off, and maternity leave
- Unlimited access to online professional courses and people manager development programs
- Flexible working environment supporting work-life balance
- Fully-stocked kitchen with snacks and beverages in the office
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Process Engineer/Data Analyst in the Organization Design & Enterprise Performance (OA&EP) team within the Office of the COO, you will play a vital role in driving data and reporting requirements to support the operations of various projects and verticals. Your attention to detail, organizational skills, and analytical mindset will be crucial in creatively analyzing data to derive meaningful conclusions, and your ability to work independently and influence stakeholders while ensuring business benefits are delivered will be paramount.

The role calls for a robust mix of technical and communication skills, with a focus on optimization, data analysis, resolving business queries, data storytelling, and data visualization. Managing the project mailbox, supporting the monthly change control process, and demonstrating solid analytical capabilities will be key aspects of your role. Additionally, understanding Accenture's organizational structure, staying updated on industry trends, and fostering a culture of continuous learning and improvement within the team will be essential.

Collaboration with team members on multi-functional initiatives, reviewing project status reporting, identifying issues, and managing relationships with stakeholder groups will be part of your daily tasks. You will work independently, with general mentorship, on new projects and requirements, contributing as an individual within the OA&EP team. Your support in project communications, stakeholder management, and ensuring relevant parties are informed and engaged will be critical to project success.

Key skills required for this role include excellent proficiency in MS Excel, PowerPoint dashboards, and Word, as well as a strong background in data modeling, dashboard creation, data visualization, and management reporting. The ability to learn fast and grow into a functional or domain expert, coupled with a strong analytical mindset, problem-solving skills, attention to detail, and big-picture thinking, will set you up for success in this role.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing, building, and maintaining scalable data pipelines using AWS services. Your role will involve integrating diverse data sources to ensure data consistency and reliability, and collaboration with data scientists and stakeholders to understand data requirements will be essential. Implementing data security measures and maintaining data integrity will be crucial aspects of your job, as will monitoring and troubleshooting data pipelines to ensure optimal performance. Additionally, you will be expected to optimize and maintain data warehouse and data lake architectures while creating and maintaining comprehensive documentation for data engineering processes.

To qualify for this role, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer with a focus on AWS is required, along with a strong understanding of data modelling, ETL processes, and data warehousing. Experience with SQL and NoSQL databases is necessary, together with familiarity with data governance and data security best practices. Proficiency in AWS services such as Redshift, S3, RDS, Glue, Lambda, and API Gateway is expected, and experience with data pipeline orchestration tools like Apache Airflow and proficiency in programming languages such as Python or Java will be advantageous.
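As one hedged example of the event-driven pipelines described above, the following AWS Lambda handler reacts to new S3 objects and forwards only validated rows; the bucket names, prefixes, and validation rule are hypothetical.

```python
# Hedged sketch of an event-driven ingestion step: an AWS Lambda handler that
# reacts to new objects landing in S3 and copies validated records onward.
# Bucket names, prefixes, and the validation rule are hypothetical.
import csv
import io

import boto3

s3 = boto3.client("s3")
CURATED_BUCKET = "example-curated-bucket"  # assumed destination bucket


def lambda_handler(event, context):
    """Triggered by s3:ObjectCreated events; keeps only rows with an order_id."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        rows = [r for r in csv.DictReader(io.StringIO(body)) if r.get("order_id")]
        if not rows:
            continue  # nothing valid to forward for this object

        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=f"validated/{key}",
            Body=out.getvalue().encode("utf-8"),
        )
    return {"status": "ok"}
```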
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As part of our GDS Consulting team, you will be part of the NCLC team delivering specifically to the Microsoft account. You will work on the latest Microsoft BI technologies and will collaborate with other teams within Consulting services.

The opportunity

We're looking for resources with expertise in Microsoft BI, Power BI, Azure Data Factory, and Databricks to join our Data Insights team. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of our service offering.

Your key responsibilities
- Responsible for managing multiple client engagements.
- Understand and analyze business requirements by working with various stakeholders and create the appropriate information architecture, taxonomy, and solution approach.
- Work independently to gather requirements and handle the cleansing, extraction, and loading of data.
- Translate business and analyst requirements into technical code.
- Create interactive and insightful dashboards and reports using Power BI, connecting to various data sources and implementing DAX calculations.
- Design and build complete ETL/Azure Data Factory processes moving and transforming data for ODS, Staging, and Data Warehousing.
- Design and develop solutions in Databricks, Scala, Spark, and SQL to process and analyze large datasets, perform data transformations, and build data models.
- Design SQL schemas, database schemas, stored procedures, functions, and T-SQL queries.

Skills and attributes for success
- Collaborating with other members of the engagement team to plan the engagement and develop work program timelines, risk assessments, and other documents/templates.
- Able to manage senior stakeholders.
- Experience in leading teams to execute high-quality deliverables within stipulated timelines.
- Skills in Power BI, Azure Data Factory, Databricks, Azure Synapse, Data Modeling, DAX, Power Query, and Microsoft Fabric.
- Strong proficiency in Power BI, including data modeling, DAX, and creating interactive visualizations.
- Solid experience with Azure Databricks, including working with Spark, PySpark (or Scala), and optimizing big data processing.
- Good understanding of various Azure services relevant to data engineering, such as Azure Blob Storage, ADLS Gen2, and Azure SQL Database/Synapse Analytics.
- Strong SQL skills and experience with one of the following: Oracle, SQL Server, Azure SQL.
- Good to have experience in SSAS or Azure SSAS and Agile project management.
- Basic knowledge of Azure Machine Learning services.
- Excellent written and communication skills and the ability to deliver technical demonstrations.
- Quick learner with a can-do attitude.
- Demonstrating and applying strong project management skills, inspiring teamwork and responsibility with engagement team members.

To qualify for the role, you must have
- A bachelor's or master's degree.
- A minimum of 4-7 years of experience, preferably with a background in a professional services firm.
- Excellent communication skills; consulting experience preferred.

Ideally, you'll also have
- The analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

What working at EY offers

At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Campaign Operator specialized in Salesforce Marketing Cloud, you will play a crucial role in transforming Martech stack operations, establishing innovative practices, and expanding operations to diverse markets. Your responsibilities will include designing, configuring, and implementing Marketing Automation Studio solutions on the Salesforce Marketing Cloud (SFMC) platform, encompassing email campaigns, Journey Builder, CloudPages, SMS, and push notification campaigns.

Your expertise in data modelling, integration architectures, and data governance processes will be pivotal in collaborating with team members to ensure seamless campaign execution from initial requirements to configuration and distribution. Working closely with IT teams, you will integrate data from various sources, ensuring data quality and accuracy by cleaning, transforming, and standardizing data as necessary. An essential aspect of your role will involve building solutions in collaboration with the IT team, utilizing out-of-the-box features and modules in Salesforce Data Cloud and Marketing Cloud while customizing them to meet specific requirements. Proficiency in Salesforce connectors, coding languages such as Apex and JavaScript, and knowledge of JSON data structures, SQL, and SOQL will enable you to optimize business processes with a focus on data integration architecture.

Your experience of over 3 years in marketing automation, including 1-2 years with SFMC, coupled with excellent business acuity and communication skills, will be invaluable in driving customer-centric campaigns and enhancing customer experience. Additionally, your familiarity with AMPscript in Journey Builder or profile segmentation, along with the ability to engage with customers to suggest optimization strategies, will contribute significantly to the technical architecture and solution design specific to Data Cloud and Marketing Cloud.

As part of the AstraZeneca team, you will have the opportunity to work in a dynamic and innovative environment that combines cutting-edge science with digital technology platforms. By embracing new technologies and processes, you will have the chance to redefine roles and work processes and contribute to cross-company change within the industry. If you are ready to make a difference and be part of a team that is dedicated to developing life-changing medicines, we encourage you to apply now and join us in our unique and daring world.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The Senior Data Engineer at EY is responsible for ingesting, building, and supporting large-scale data architectures that serve multiple downstream systems and business users. This individual collaborates with Data Engineer Leads and partners with Visualization on data quality and troubleshooting needs.

Your key responsibilities include:
- Cleaning, aggregating, and organizing data from various sources and transferring it to data warehouses.
- Supporting the development, testing, and maintenance of data pipelines and platforms to ensure data quality for business dashboards and tools.
- Creating, maintaining, and supporting the data platform and infrastructure that enables the analytics front-end, including high-volume, large-scale data processing and databases with proper verification and validation processes.

In terms of data engineering, you will be expected to:
- Develop and maintain scalable data pipelines following ETL principles using AWS-native technologies.
- Define data requirements, gather and mine data, and validate the efficiency of data tools in the big data environment.
- Lead the evaluation, implementation, and deployment of emerging tools and processes to enhance productivity.
- Implement processes and systems to ensure accurate and available data for key stakeholders and downstream systems.

Furthermore, you will be required to collaborate with various teams within the organization, including Business Analytics, Solution Architects, Data Scientists, and AI/ML engineers, to develop technical architectures and solutions that enable advanced analytics, machine learning, and predictive modeling.

To qualify for this role, you must have the following essential skillsets:
- Bachelor's degree in Engineering, Computer Science, Data Science, or a related field.
- 2+ years of experience in software development, data engineering, ETL, and analytics reporting development.
- Experience with data engineering programming languages, distributed data technologies, cloud platforms, relational SQL databases, DevOps, continuous integration, and AWS cloud services.
- Strong organizational, communication, problem-solving, and troubleshooting skills.

Desired skillsets include a Master's degree in a related field and experience in a global working environment. EY is dedicated to building a better working world by creating long-term value for clients, people, and society through diverse teams in over 150 countries. If you are passionate about data and technology and want to contribute to transformative solutions, join us on this journey.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You will be joining ITSource Technologies Limited, a client-oriented IT services, BPO, and IT staffing company known for its commitment to excellence. With a track record of over 22 years, we specialize in Enterprise Applications, Big Data, Staffing, BI, Cloud, and Web Solutions. Our industry reputation is built on a customer-centric approach and exceptional talent management capabilities.

As an AWS Data Architect based in Pune, this full-time on-site role will involve overseeing data governance, data architecture, data modeling, Extract Transform Load (ETL), and data warehousing processes. Your responsibilities will include applying data modeling techniques to address new business requirements, utilizing SQL, advanced SQL, and NoSQL databases, managing data warehousing and ETL tasks, engaging with customers to grasp business needs, and working with Tableau and Power BI tools.

To excel in this role, you should possess skills in data governance, data architecture, data modeling, and ETL processes. Strong analytical and problem-solving abilities are essential, as is familiarity with AWS services and infrastructure. Effective communication and collaboration skills are key, along with a Bachelor's degree in Computer Science or a related field. An AWS certification would be considered a plus.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
noida, uttar pradesh
On-site
NTT DATA is looking for a Software Dev. Sr. Specialist Advisor to join the team in Noida, Uttar Pradesh, India. As an AWS Redshift Datalake engineer, you will work with the Data team to create and maintain scalable data pipelines dealing with petabytes of data. The projects involve cutting-edge technologies and petabyte-scale data processing systems, data warehouses, and data lakes that meet the growing information needs of customers.

You should have 5+ years of hands-on experience with AWS Redshift, including data loading into Redshift and ETL experience. Additionally, you should be proficient in data modeling, SQL stored procedures, basic Python, and JSON file manipulation, and have experience working in agile teams. Nice-to-have skills include Airflow, Kafka, and Tableau. Your responsibilities will include participating in daily scrum meetings, design and development activities, providing expert advice to resolve technical bottlenecks, implementing POCs to address technical debt, and mentoring the team to enhance AWS knowledge. The ideal candidate will have a Bachelor of Engineering/Technology degree with a focus on Computer Science or Software Engineering (or equivalent).

NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. With experts in over 50 countries and a robust partner ecosystem, NTT DATA offers consulting, data and artificial intelligence, industry solutions, application development, infrastructure management, and more. As a part of the NTT Group, NTT DATA invests significantly in R&D to support organizations and society in the digital future. Visit us at us.nttdata.com.
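A typical Redshift bulk-load step of the kind this role involves issues a COPY from S3; the sketch below does so through psycopg2, with a hypothetical cluster endpoint, IAM role ARN, and table names.

```python
# Illustrative only: a common Redshift load pattern is COPY from staged S3
# files. Cluster endpoint, credentials, IAM role ARN, and table names are
# placeholders.
import psycopg2

COPY_SQL = """
    COPY analytics.stg_orders
    FROM 's3://example-data-lake/orders/2024-06-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS JSON 'auto'
    TIMEFORMAT 'auto';
"""


def load_orders():
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="warehouse",
        user="etl_user",
        password="********",
    )
    try:
        with conn, conn.cursor() as cur:
            cur.execute(COPY_SQL)                         # bulk load staged JSON files
            cur.execute("ANALYZE analytics.stg_orders;")  # refresh planner statistics
    finally:
        conn.close()
```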
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity

We're looking for a senior data analytics professional to create and manage large BI and analytics solutions using visualization tools such as OBIEE/OAC that turn data into knowledge. In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator; business acumen and problem-solving aptitude would be a plus.

Your key responsibilities
- Work as a team member and lead, contributing to various technical streams of OBIEE/OAC implementation projects.
- Provide product- and design-level technical best practices.
- Interface and communicate with the onsite coordinators.
- Complete assigned tasks on time and report status regularly to the lead.

Skills and attributes for success
- Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
- Strong communication, presentation, and team-building skills and experience in producing high-quality reports, papers, and presentations.
- Exposure to BI and other visualization tools in the market.
- Building a quality culture.
- Fostering teamwork.
- Participating in organization-wide people initiatives.

To qualify for the role, you must have
- BE/BTech/MCA/MBA with adequate industry experience.
- At least around 3 to 7 years of experience in OBIEE/OAC.
- Experience working with OBIEE and OAC end-to-end implementations.
- Understanding of ETL/ELT processes using tools like Informatica/ODI/SSIS.
- Knowledge of reporting, dashboards, and RPD logical modeling.
- Experience with BI Publisher.
- Experience with Agents.
- Experience in security implementation in OAC/OBIEE.
- Ability to manage self-service data preparation, data sync, and data flows, and to work with curated data sets.
- Ability to manage connections to multiple data sources, cloud and non-cloud, using the various data connectors available with OAC.
- Experience in creating pixel-perfect reports and managing content in the catalog, dashboards, prompts, and calculations.
- Ability to create data sets, map layers, multiple data visualizations, and stories in OAC.
- Good understanding of various data models, e.g., snowflakes, data marts, star data models, data lakes, etc.
- Excellent written and verbal communication.
- Cloud experience is an added advantage.
- Experience migrating OBIEE on-premise to Oracle Analytics in the cloud.
- Knowledge of and working experience with Oracle Autonomous Database.
- Strong knowledge of DWH concepts.
- Strong data modeling skills.
- Familiarity with Agile and Waterfall SDLC processes.
- Strong SQL/PLSQL with analytical skills.

Ideally, you'll also have
- Experience in the Insurance and Banking domains.
- A strong hold on project delivery and team management.
- Excellent written and verbal communication skills.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
coimbatore, tamil nadu
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As a Senior Visualization Developer, your main objective is to transform quality datasets into creative, user-friendly dashboards that address business needs and drive informed decision-making across the enterprise. Reporting to the Lead Visualization Developer, you will collaborate with Data Engineering teams to ensure data quality and troubleshoot as needed. Your responsibilities include:
- Building user tools, dashboards, and automatic report generation tools to analyze and present data associated with product and solution performance, business operations, and data-driven decision-making.
- Conducting in-depth quantitative analysis and producing clear, detailed visualizations to facilitate informed decision-making.
- Engaging with other Business Units to gather business objective requirements, influence service management, and build tools to support the needs of partner teams.

Key Responsibilities:
- Developing and implementing data visualization solutions using tools like Power BI to meet the needs of business stakeholders.
- Conducting pre-production testing and ensuring data accuracy against sources.
- Supporting wireframe creation and following best practices for data visualization.
- Designing and implementing complex data visualizations for large-scale data sets.
- Monitoring and analyzing the performance of Power BI solutions and optimizing as needed.
- Ensuring accurate source data retrieval using programming functions like SQL, DAX, and Python.
- Staying updated on emerging data visualization tools and technologies.
- Collaborating with various teams to translate requirements into technical specifications.
- Managing demand intake and prioritizing requests.
- Applying best practices in data modeling and optimization.
- Communicating and delivering reports to stakeholders effectively.

Skills and attributes required:
- Technical proficiency in testing and developing reports.
- Strong business acumen, preferably in Pharmaceutical, Healthcare, or Life Sciences.
- Ability to create visually appealing dashboards using Power BI.
- Leadership skills to drive strategic data visualization projects.
- Decision-making based on data-driven insights.
- Track record of creating interactive data sets and storyboards.
- Passion for innovation and leveraging digital technologies.
- Comfort with complex data integration and analysis.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
- 3+ years of experience as a Visualization/Power BI developer.
- Strong understanding of data modeling and integration concepts.
- Proficiency in Power BI, SQL, Python, and data engineering.
- Excellent communication and organizational skills.
- Self-motivated and able to work independently in an agile environment.

Desired qualifications include experience in a global working environment and proficiency in Python. Join EY and be a part of building a better working world by leveraging data and technology to drive innovation and growth across various sectors.
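The "data accuracy against sources" responsibility above often reduces to a reconciliation check; here is a small pandas sketch of that idea, with hypothetical table shapes, keys, and a tolerance chosen only for illustration.

```python
# Sketch (hypothetical tables and measures): reconciling a dashboard extract
# against the source system, the kind of pre-production accuracy check the
# role describes.
import pandas as pd


def reconcile(source: pd.DataFrame, report_extract: pd.DataFrame,
              keys=("region", "month"), measure="net_sales",
              tolerance: float = 0.005) -> pd.DataFrame:
    """Return key combinations where the report drifts from source by more
    than the tolerance (default 0.5%)."""
    src = source.groupby(list(keys), as_index=False)[measure].sum()
    rpt = report_extract.groupby(list(keys), as_index=False)[measure].sum()
    merged = src.merge(rpt, on=list(keys), suffixes=("_source", "_report"))
    merged["pct_diff"] = (
        (merged[f"{measure}_report"] - merged[f"{measure}_source"]).abs()
        / merged[f"{measure}_source"].abs()
    )
    return merged[merged["pct_diff"] > tolerance]
```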
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
Join us as a Senior Developer at Barclays, where you will play a crucial role in supporting the successful delivery of Location Strategy projects while adhering to plan, budget, quality, and governance standards. Your primary responsibility will be to drive the evolution of our digital landscape, fostering innovation and excellence. By leveraging cutting-edge technology, you will lead the transformation of our digital offerings, ensuring unparalleled customer experiences.

To excel in this role as a Senior Developer, you should possess the following experience and skills:
- Solid hands-on development experience with Scala, Spark, Python, and Java.
- Excellent working knowledge of Hadoop components such as HDFS, Hive, Impala, HBase, and data frames.
- Proficiency in Jenkins build pipelines or other CI/CD tools.
- Sound understanding of Data Warehousing principles and Data Modeling.

Additionally, highly valued skills may include:
- Experience with AWS services like S3, Athena, DynamoDB, Lambda, and Databricks.
- Working knowledge of Jenkins, Git, and Unix.

Your performance may be assessed based on critical skills essential for success in this role, including risk and controls management, change and transformation capabilities, business acumen, strategic thinking, and proficiency in digital and technology aspects. This position is based in Pune.

**Purpose of the Role:**

The purpose of this role is to design, develop, and enhance software solutions using various engineering methodologies to deliver business, platform, and technology capabilities for our customers and colleagues.

**Accountabilities:**
- Develop and deliver high-quality software solutions using industry-aligned programming languages, frameworks, and tools. Ensure that the code is scalable, maintainable, and optimized for performance.
- Collaborate cross-functionally with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration with business objectives.
- Engage in peer collaboration, participate in code reviews, and promote a culture of code quality and knowledge sharing.
- Stay updated on industry technology trends, contribute to the organization's technology communities, and foster a culture of technical excellence and growth.
- Adhere to secure coding practices to mitigate vulnerabilities, protect sensitive data, and deliver secure software solutions.
- Implement effective unit testing practices to ensure proper code design, readability, and reliability.

**Assistant Vice President Expectations:**

As an Assistant Vice President, you are expected to:
- Provide consultation on complex issues, offering advice to People Leaders to resolve escalated matters.
- Identify and mitigate risks, and develop new policies/procedures to support the control and governance agenda.
- Take ownership of risk management and control strengthening related to the work undertaken.
- Engage in complex data analysis from various internal and external sources to creatively solve problems.
- Communicate complex information effectively to stakeholders.
- Influence or convince stakeholders to achieve desired outcomes.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as embrace the Barclays Mindset to Empower, Challenge, and Drive, the guiding principles for our behavior.
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
jaipur, rajasthan
On-site
You should have a minimum of 12 years of experience in Process Mining, with proficiency in tools like Celonis or similar software. A thorough understanding of the data lifecycle is essential, as is experience in data extraction and data modeling of the kind expected of a data architect. Key skills required for this role include expertise in Process Mining tools, Celonis, the data lifecycle, data extraction, and data modeling. This position is Pan India.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
bhubaneswar
On-site
At Dark Matter Technologies, you are part of a cutting-edge revolution in loan origination. The company's commitment to advanced AI and origination technology solutions is reshaping the industry landscape, paving the way for seamless, efficient, and automated experiences. As a Senior Data Scientist at Dark Matter Technologies, you will play a pivotal role in applying data science and machine learning techniques to tackle complex business challenges and foster innovation. With a minimum of 10 years of experience in the field, your expertise will be central to designing, developing, and implementing sophisticated models for regression, prediction, ranking, and trend analysis.

Your responsibilities will include selecting features, building and optimizing classifiers using machine learning techniques, conducting hypothesis testing, and developing predictive models. Collaborating with cross-functional teams, including data engineers, front-end developers, and security specialists, you will integrate AI models into business solutions. Additionally, you will conduct research and experiments to enhance AI models and algorithms, staying updated on the latest AI/ML technologies and trends.

To excel in this role, you must possess at least 10 years of experience in Predictive Analytics and Data Mining, with hands-on expertise in areas such as Linear Regression, Classification Techniques, Logistic Regression, and unsupervised learning techniques like k-means/hierarchical clustering. Proficiency in optimization methods and in working with R/Python, T-SQL, and graph-based models is essential. Hands-on experience with Python, Azure Machine Learning, and other Azure AI tools is highly desirable. Your role will involve developing and deploying machine learning models for various use cases, optimizing models through advanced feature engineering, working with large-scale datasets, and collaborating with stakeholders to design data science solutions that address organizational needs. Strong problem-solving skills, a solid understanding of statistical analysis and machine learning algorithms, and excellent communication skills are key attributes for success in this position.

Preferred skills include exposure to machine learning libraries like scikit-learn, TensorFlow, or PyTorch, leveraging Azure Machine Learning for data pipelines, experiments, and deployments, and implementing end-to-end AI solutions using Azure AI services. Certification in Microsoft Azure AI related to Azure Machine Learning, expert database knowledge in SQL, and proficiency in Azure Data Services are advantageous. Join Dark Matter Technologies to be at the forefront of data science innovation and contribute to reshaping the future of loan origination through advanced AI technologies and solutions.
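As a hedged sketch of two techniques the posting names, logistic regression for classification and k-means for segmentation, the following scikit-learn example runs both on a synthetic dataset; features, parameters, and metrics are illustrative, not a production model.

```python
# Hedged sketch: logistic regression (classification) and k-means
# (segmentation) on synthetic data; all parameters are illustrative.
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=5000, n_features=12, n_informative=6,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

scaler = StandardScaler().fit(X_train)
clf = LogisticRegression(max_iter=1000).fit(scaler.transform(X_train), y_train)
probs = clf.predict_proba(scaler.transform(X_test))[:, 1]
print("Hold-out AUC:", round(roc_auc_score(y_test, probs), 3))

# Unsupervised view of the same features: a simple k-means segmentation.
segments = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(
    scaler.transform(X)
)
print("Records per segment:",
      {int(s): int((segments == s).sum()) for s in set(segments)})
```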
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
karnataka
On-site
As an Audit Intern (Automation) at Standard Chartered within the Group Internal Audit (GIA) team, you will play a vital role in supporting the Board and Executive Management by safeguarding the Group's assets, reputation, and sustainability. GIA serves as the third line of defence within the Bank, offering independent assurance on the effectiveness of management control over business activities and risk frameworks.

The 6-month internship programme gives you hands-on experience working on real-world challenges, refining your skills, building lasting connections, gaining valuable insights, and exploring your capabilities. Successful performance during the programme may lead to a permanent job offer upon graduation. Throughout the internship, you will delve into the workings of the GIA team, understanding its processes, people, and potential career paths within the team. Your journey will involve a structured orientation, classroom training on products and business, on-the-job learning, technical seminars, mentorship, and engagement with senior management.

As a GIA Intern, you will focus on data analytics in internal auditing, collaborating closely with the Internal Audit team to evaluate and enhance internal controls, mitigate risks, and contribute to process enhancements. Your responsibilities will include utilizing data analytics techniques to identify risks and drive innovation, designing and implementing data models using languages and tools such as Python, SQL, or Hive, and collaborating with stakeholders to leverage data effectively.

Ideal candidates for this role are penultimate-year students with a passion for teamwork, strong academic performance, extracurricular involvement, project management skills, and technical proficiency. You should possess experience in data analysis and data modelling, along with problem-solving abilities. Eligible candidates must have the legal right to work in the country of application between July 2025 and May 2026.

Standard Chartered is a global bank dedicated to making a positive impact for clients, communities, and employees. With a focus on diversity, inclusion, and innovation, we are committed to driving commerce and prosperity through our unique values. Join us in challenging the status quo, embracing opportunities for growth, and working towards a meaningful career that aligns with our purpose and values. In addition to a supportive work environment, we offer competitive benefits including retirement savings, medical and life insurance, flexible leave options, wellbeing support programmes, continuous learning opportunities, and a culture that values diversity and inclusion. If you are looking for a purpose-driven career in a bank that celebrates individual talents and fosters a culture of respect and growth, we encourage you to explore opportunities at Standard Chartered.
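One simple flavour of the audit analytics described above is outlier detection; the pandas sketch below flags transactions far from their category mean using a z-score, with assumed column names and thresholds.

```python
# Purely illustrative: flag unusual transactions for audit review using a
# z-score test within each expense category. Column names and thresholds
# are assumptions.
import pandas as pd


def flag_outliers(transactions: pd.DataFrame, z_threshold: float = 3.0) -> pd.DataFrame:
    """Mark transactions whose amount is more than z_threshold standard
    deviations from the mean of their category."""
    grouped = transactions.groupby("category")["amount"]
    z = (transactions["amount"] - grouped.transform("mean")) / grouped.transform("std")
    flagged = transactions.assign(z_score=z)
    return flagged[flagged["z_score"].abs() > z_threshold]


# Example usage with a tiny in-memory sample; the lone 900 in "travel"
# stands out from its peers and is flagged at a threshold of 2.5.
sample = pd.DataFrame({
    "category": ["travel"] * 10 + ["meals"] * 10,
    "amount":   [120, 130, 110, 125, 118, 122, 127, 115, 124, 900,
                 30, 25, 35, 28, 32, 31, 29, 27, 33, 26],
})
print(flag_outliers(sample, z_threshold=2.5))
```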
Posted 1 week ago