Jobs
Interviews

1052 ETL Processes Jobs - Page 10

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

8.0 - 10.0 years

6 - 11 Lacs

Nagpur, Maharashtra, India

On-site

Key Responsibilities:
- Collaborate with clients to understand their master data management needs and design effective Profisee-based solutions.
- Install, configure, and optimize Profisee software within client environments to ensure smooth deployments.
- Develop workflows, data mappings, and integrations to support efficient data management practices.
- Conduct thorough system testing, troubleshoot issues, and perform performance tuning to ensure solution stability.
- Provide technical training and create documentation to help clients effectively use the Profisee platform.
- Assist with system upgrades and ongoing maintenance to maintain consistent functionality.
- Partner with business analysts, developers, and architects to align technical solutions with business objectives.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in Profisee or similar master data management platforms.
- Strong understanding of database systems, data modeling, and ETL processes.
- Programming skills in SQL, Python, C#, or similar languages.
- Excellent problem-solving abilities and the capability to work independently or collaboratively.
- Effective communication skills for client and stakeholder interaction.

Preferred Qualifications:
- Prior experience implementing Profisee solutions for enterprise clients.
- Certification in Profisee or other relevant MDM tools.
- Familiarity with cloud platforms such as Azure or AWS.
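The data-mapping and cleansing work described above can be sketched in a few lines of plain Python. This is illustrative only, not Profisee's actual API; the field names and the "first record wins" survivorship rule are assumptions for the example.

```python
# Minimal master-data cleansing/mapping sketch (hypothetical schema,
# not Profisee's API): standardize fields, then deduplicate by a match key.

def standardize(record):
    """Map raw source fields onto a clean target schema."""
    return {
        "customer_id": record["id"],
        "name": record["name"].strip().title(),
        "email": record["email"].strip().lower(),
    }

def deduplicate(records, key="email"):
    """Keep the first record seen for each match key (survivorship rule)."""
    seen, golden = set(), []
    for rec in records:
        if rec[key] not in seen:
            seen.add(rec[key])
            golden.append(rec)
    return golden

raw = [
    {"id": 1, "name": "  alice smith ", "email": "Alice@Example.com "},
    {"id": 2, "name": "Alice Smith", "email": "alice@example.com"},
]
golden = deduplicate([standardize(r) for r in raw])
```

In a real MDM deployment the standardization and match rules would be configured in the platform rather than hand-coded, but the match-and-survive pattern is the same.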

Posted 2 weeks ago

Apply

5.0 - 7.0 years

6 - 11 Lacs

Surat, Gujarat, India

On-site

Role Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO (Basic Formal Ontology) and CCO (Common Core Ontology) principles, ensuring alignment with business needs and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into structured ontologies.
- Develop and maintain comprehensive ontologies to represent business entities, relationships, and processes.

Data Modeling:
- Design semantic and syntactic data models adhering to ontological principles.
- Create scalable, flexible, and adaptable data models that meet evolving business requirements.
- Integrate data models with existing data infrastructures and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs leveraging ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and ongoing maintenance.
- Utilize knowledge graphs for advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure accuracy, quality, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement governance processes and standards for ontology and knowledge graph maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively across diverse audiences.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Experience: 5+ years in data engineering or related roles.
- Proven experience with ontology development using BFO, CCO, or similar frameworks.
- Strong knowledge of semantic web technologies: RDF, OWL, SPARQL, SHACL.
- Proficiency in Python, SQL, and other relevant programming languages.
- Experience with graph databases (TigerGraph, JanusGraph) and triple stores (GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing (NLP) techniques.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of Databricks technologies: Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, Photon.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
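The knowledge-graph work this role describes boils down to representing facts as subject-predicate-object triples and querying them by pattern. A conceptual sketch in pure Python (the entities and predicates here are invented; real work would use RDF/OWL with SPARQL against a store such as GraphDB or Stardog):

```python
# Tiny in-memory triple store illustrating the knowledge-graph idea.

triples = {
    ("Widget", "rdf:type", "Product"),
    ("Widget", "producedBy", "PlantA"),
    ("PlantA", "rdf:type", "Facility"),
    ("PlantA", "locatedIn", "Surat"),
}

def match(s=None, p=None, o=None):
    """SPARQL-like pattern match: None acts as a wildcard variable."""
    return [
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    ]

# "Where is the plant that produces Widget?" -- a two-hop graph traversal.
plant = match("Widget", "producedBy")[0][2]
location = match(plant, "locatedIn")[0][2]
```

Multi-hop traversals like this are what make graph representations attractive for search and recommendation: the query follows relationships rather than joining tables.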

Posted 2 weeks ago

Apply

7.0 - 10.0 years

6 - 14 Lacs

Kolkata, West Bengal, India

On-site

Key Responsibilities:
- Dataiku Leadership: Lead data engineering initiatives focusing on leveraging Dataiku's capabilities for data preparation, analysis, visualization, and deploying data-driven solutions.
- Data Pipeline Development: Design, develop, and optimize scalable and robust data pipelines to support business intelligence and advanced analytics projects, including automation of ETL/ELT processes from diverse data sources.
- Data Modeling & Architecture: Apply best practices in data modeling (dimensional modeling per Kimball or Inmon) to create efficient, scalable database architectures ensuring data integrity and performance.
- ETL/ELT Expertise: Implement, manage, and optimize ETL/ELT workflows using various tools to maintain reliable, high-quality data flow and accessibility.
- Gen AI Integration: Explore and implement solutions using LLM Mesh or similar frameworks to integrate Generative AI capabilities into data engineering processes.
- Programming & Scripting: Use Python and SQL extensively for data manipulation, automation, and development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on AWS or Azure, leveraging cloud services for performance and cost efficiency.
- Data Quality & Governance: Ensure integration of data sources maintains high-quality, consistent, and accessible data; implement and follow data governance best practices.
- Collaboration & Mentorship: Work closely with data scientists, analysts, and other stakeholders to translate data requirements into effective solutions; mentor junior team members when needed.
- Performance Optimization: Continuously monitor and optimize data pipeline and system performance to meet business needs.

Required Skills & Experience:
- Proficiency in Dataiku for data preparation, visualization, and building end-to-end data pipelines and applications.
- Strong expertise in data modeling techniques such as dimensional modeling (Kimball, Inmon).
- Extensive experience with ETL/ELT tools and processes (e.g., Dataiku built-in tools, Apache Airflow, Talend, SSIS).
- Familiarity with LLM Mesh or similar Generative AI frameworks.
- Advanced skills in Python programming and SQL querying for data manipulation and automation.
- Hands-on experience with cloud platforms such as AWS or Azure for scalable data deployments.
- Understanding of Generative AI concepts and potential applications.
- Excellent analytical, problem-solving, communication, and interpersonal skills.

Bonus Skills (Nice to Have):
- Experience with big data technologies such as Spark, Hadoop, and Snowflake.
- Knowledge of data governance and security best practices.
- Familiarity with MLOps principles and tools.
- Contributions to open-source projects in data engineering or AI.

Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
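The extract-transform-load pattern at the heart of this role can be reduced to three composable steps. A minimal sketch with made-up source data (in practice each stage would be a Dataiku recipe or an Airflow task rather than a plain function):

```python
# Minimal extract-transform-load sketch (illustrative only).

def extract():
    # Stand-in for reading from a source system.
    return [{"sku": "A1", "qty": "3", "price": "9.50"},
            {"sku": "B2", "qty": "0", "price": "4.00"}]

def transform(rows):
    # Cast types, derive a revenue column, drop zero-quantity rows.
    out = []
    for r in rows:
        qty, price = int(r["qty"]), float(r["price"])
        if qty > 0:
            out.append({"sku": r["sku"], "qty": qty, "revenue": qty * price})
    return out

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

Keeping extract, transform, and load as separate pure steps is what makes pipelines like this testable and easy to schedule or rerun independently.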

Posted 2 weeks ago

Apply

10.0 - 12.0 years

6 - 14 Lacs

Jaipur, Rajasthan, India

Remote

Role Responsibilities:
- Collaborate with stakeholders to gather and understand reporting requirements, translating them into interactive and insightful visualizations.
- Design and develop Power BI reports and dashboards that deliver actionable business insights.
- Create detailed wireframes and prototypes using Figma to effectively communicate UI/UX design concepts.
- Apply data visualization best practices to ensure reports are user-friendly and intuitive.
- Develop and maintain robust data models in Power BI to support analytical needs.
- Analyze data to identify trends and patterns that inform business decision-making.
- Provide training and ongoing support to end users on dashboard functionality.
- Work cross-functionally to gather feedback and continuously improve reporting solutions.
- Test and validate data accuracy and integrity within reports and dashboards.
- Implement data governance best practices to ensure data security and compliance.
- Stay current with the latest Power BI features and UI/UX design trends to enhance reporting quality.
- Assist in project management to ensure timely delivery of Power BI projects.
- Document report development processes and create user guides.
- Support ad-hoc reporting requests from stakeholders as required.
- Mentor junior team members and promote knowledge sharing to foster a collaborative environment.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related field.
- Minimum 10 years of experience in Power BI consulting and data visualization.
- Proficiency in Figma for wireframing and UI/UX design.
- Strong grasp of wireframing principles and design-thinking methodologies.
- Hands-on experience in data analysis and data modeling.
- Excellent problem-solving skills with attention to detail.
- Strong communication skills with the ability to engage and collaborate with stakeholders.
- Experience working in Agile project environments.
- Ability to manage multiple projects and deadlines effectively.
- Proficiency in SQL and other data querying languages.
- Familiarity with DAX and Power Query for Power BI report development.
- Knowledge of data governance practices and compliance.
- Skilled in delivering effective user training and support.
- Solid understanding of business intelligence tools and methodologies.
- Self-motivated, with the ability to work independently in a remote setting.
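The DAX modeling this role calls for typically means iterating a fact table and looking up related dimension values, e.g. a measure like Total Revenue = SUMX(Sales, Sales[Qty] * RELATED(Product[Price])). A rough Python analogue, with invented table and column names, just to show the row-context-plus-lookup idea:

```python
# Rough Python analogue of a DAX-style measure over a tiny star schema.
# Table/column names are hypothetical, not from any real model.

product = {"P1": 10.0, "P2": 25.0}          # Product dimension: price by id
sales = [("P1", 4), ("P2", 2), ("P1", 1)]   # Sales fact rows: (product_id, qty)

def total_revenue(rows, prices):
    # SUMX-like iteration: per-row qty times the RELATED product price.
    return sum(qty * prices[pid] for pid, qty in rows)

revenue = total_revenue(sales, product)
```

In Power BI the relationship between the fact and dimension tables replaces the explicit dictionary lookup, but the evaluation logic per row is the same.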

Posted 2 weeks ago

Apply

7.0 - 10.0 years

6 - 14 Lacs

Lucknow, Uttar Pradesh, India

On-site

Key Responsibilities:
- Dataiku Leadership: Lead data engineering initiatives focusing on leveraging Dataiku's capabilities for data preparation, analysis, visualization, and deploying data-driven solutions.
- Data Pipeline Development: Design, develop, and optimize scalable and robust data pipelines to support business intelligence and advanced analytics projects, including automation of ETL/ELT processes from diverse data sources.
- Data Modeling & Architecture: Apply best practices in data modeling (dimensional modeling per Kimball or Inmon) to create efficient, scalable database architectures ensuring data integrity and performance.
- ETL/ELT Expertise: Implement, manage, and optimize ETL/ELT workflows using various tools to maintain reliable, high-quality data flow and accessibility.
- Gen AI Integration: Explore and implement solutions using LLM Mesh or similar frameworks to integrate Generative AI capabilities into data engineering processes.
- Programming & Scripting: Use Python and SQL extensively for data manipulation, automation, and development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on AWS or Azure, leveraging cloud services for performance and cost efficiency.
- Data Quality & Governance: Ensure integration of data sources maintains high-quality, consistent, and accessible data; implement and follow data governance best practices.
- Collaboration & Mentorship: Work closely with data scientists, analysts, and other stakeholders to translate data requirements into effective solutions; mentor junior team members when needed.
- Performance Optimization: Continuously monitor and optimize data pipeline and system performance to meet business needs.

Required Skills & Experience:
- Proficiency in Dataiku for data preparation, visualization, and building end-to-end data pipelines and applications.
- Strong expertise in data modeling techniques such as dimensional modeling (Kimball, Inmon).
- Extensive experience with ETL/ELT tools and processes (e.g., Dataiku built-in tools, Apache Airflow, Talend, SSIS).
- Familiarity with LLM Mesh or similar Generative AI frameworks.
- Advanced skills in Python programming and SQL querying for data manipulation and automation.
- Hands-on experience with cloud platforms such as AWS or Azure for scalable data deployments.
- Understanding of Generative AI concepts and potential applications.
- Excellent analytical, problem-solving, communication, and interpersonal skills.

Bonus Skills (Nice to Have):
- Experience with big data technologies such as Spark, Hadoop, and Snowflake.
- Knowledge of data governance and security best practices.
- Familiarity with MLOps principles and tools.
- Contributions to open-source projects in data engineering or AI.

Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

6 - 14 Lacs

Chennai, Tamil Nadu, India

Remote

Role Responsibilities:
- Design and develop scalable data pipelines using MS Fabric to support business intelligence and analytics needs.
- Build and optimize data models that facilitate effective data storage and retrieval.
- Manage ETL (Extract, Transform, Load) processes, ensuring efficient data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define comprehensive data requirements.
- Ensure data quality, integrity, and consistency across all data processes.
- Implement and enforce best practices for data management, storage, and processing.
- Conduct performance tuning for data storage systems and query execution to enhance efficiency.
- Create and maintain detailed documentation for data architecture, workflows, and processes.
- Troubleshoot data-related issues and implement timely and effective solutions.
- Monitor and optimize cloud-based data solutions for scalability and resource efficiency.
- Research and evaluate emerging data engineering tools and technologies for project incorporation.
- Assist in designing and enforcing data governance frameworks and policies.
- Provide technical guidance and mentorship to junior data engineers.
- Participate in code reviews to ensure adherence to coding standards and quality.
- Stay updated on industry trends and best practices in data engineering and analytics.

Qualifications:
- Minimum of 8 years of experience in data engineering or related roles.
- Strong expertise and hands-on experience with MS Fabric and its ecosystem.
- Proficiency in SQL and experience working with relational database management systems.
- Solid experience in data warehousing solutions and data modeling techniques.
- Hands-on experience with ETL tools and data integration processes.
- Familiarity with major cloud computing platforms such as Azure, AWS, and GCP.
- Working knowledge of Python or other programming languages commonly used in data engineering.
- Proven ability to clearly communicate complex technical concepts to non-technical stakeholders.
- Experience implementing data quality measures and data governance practices.
- Excellent problem-solving skills and keen attention to detail.
- Ability to work independently in remote and distributed team environments.
- Experience with data visualization tools is advantageous.
- Strong analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or a related discipline.
- Familiarity with Agile methodologies and project management practices.
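The "ensure data quality, integrity, and consistency" duty above is usually automated as a set of checks gating each pipeline run. A minimal sketch using the standard library's sqlite3 as a stand-in for the warehouse (the table, columns, and check names are invented for illustration):

```python
import sqlite3

# Simple data-quality gate of the kind a pipeline might run after a load.

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 50.0), (2, 75.5), (3, None)])

row_count = con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
null_amounts = con.execute(
    "SELECT COUNT(*) FROM orders WHERE amount IS NULL").fetchone()[0]

checks = {
    "non_empty": row_count > 0,
    "no_null_amounts": null_amounts == 0,   # fails here: one NULL present
}
failed = [name for name, ok in checks.items() if not ok]
```

A production framework would report `failed` to monitoring and block downstream tasks, but the check-then-gate structure is the same.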

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Software Engineer, Salesforce at Autodesk, you will be responsible for developing and leading the design and architecture of Salesforce solutions. You will play a crucial role in designing, coding, testing, and implementing Salesforce solutions that align with business needs. Join our team of talented engineers in a hyper-growth phase, working on an expanded set of features and collaborating within and across teams to build best-of-breed solutions for a company committed to imagining, designing, and creating a better world. You will report to the Engineering Manager and work from our Bangalore location.

Your responsibilities will include leading the design and architecture of Salesforce solutions, ensuring scalability, efficiency, and adherence to best practices. You will make key technical decisions, provide architectural guidance to the team, and write advanced code to implement complex custom functionality using Salesforce technologies such as Apex, Lightning Components, and Lightning Web Components. Additionally, you will ensure code quality and maintainability, lead customization efforts, and ensure comprehensive documentation of complex solutions. Furthermore, you will lead complex integration projects; develop and manage advanced data solutions within Salesforce, including data modeling, ETL processes, and data migrations for complex scenarios; develop advanced security measures; define and oversee advanced testing strategies; plan and lead complex deployments of Salesforce configurations and code; and mentor junior team members while taking responsibility for their deliverables.

The minimum qualifications for this position include a Bachelor's degree in Computer Science, Information Technology, or a related field (preferred), Salesforce Developer certification, and at least 5 years of experience as a Salesforce Developer with a strong understanding of Salesforce capabilities and limitations. You should also have experience with the SFDC platform and its partner ecosystem, Sales Cloud, and sales processes, including experience working in a complex, integrated environment.

If you are passionate about shaping the world and your future, and want to work in an environment that values diversity and belonging and offers an equitable workplace where everyone can thrive, consider joining Autodesk. We offer a competitive compensation package based on experience and geographic location, along with a comprehensive benefits package. Join us to be your whole, authentic self and do meaningful work that helps build a better future for all.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire ETL Testing Professionals in the following areas:

Experience: 8-10 Years

ETL Test Lead

Job Summary: We are seeking an experienced ETL Test Lead with strong expertise in data warehouse concepts, test strategy development, and team leadership. The ideal candidate will be responsible for leading ETL testing efforts, ensuring data integrity, and collaborating with cross-functional teams to deliver high-quality solutions.

Key Responsibilities:
- Lead ETL testing activities across multiple projects and ensure adherence to testing standards and best practices.
- Develop and implement comprehensive test strategies, plans, and test cases for data warehouse and ETL processes.
- Validate data transformations, data loads, and data quality across the various stages of the ETL pipeline.
- Collaborate with business analysts, developers, and stakeholders to understand data requirements and ensure accurate testing coverage.
- Perform root cause analysis of data issues and provide recommendations for resolution.
- Manage and mentor a team of testers, ensuring timely delivery and quality assurance.
- Report testing progress, metrics, and issues to project stakeholders.
- Ensure compliance with organizational and regulatory data standards.

Required Skills & Qualifications:
- Strong understanding of data warehouse concepts, ETL processes, and data modeling.
- Experience with Snowflake.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, SSIS) and SQL for data validation.
- Proven experience in designing and executing test strategies for large-scale data systems.
- Excellent communication and stakeholder management skills.
- Experience in leading testing teams and managing deliverables.
- Familiarity with Agile methodologies and defect-tracking tools (e.g., JIRA, HP ALM).
- Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications:
- Experience with cloud data platforms (e.g., AWS, Azure, GCP).
- Knowledge of automation frameworks for ETL testing.
- ISTQB or equivalent testing certification.

Required Technical/Functional Competencies:
- Domain/Industry Knowledge: Basic knowledge of the customer's business processes and the relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
- Requirement Gathering and Analysis: Working knowledge of requirement management and analysis processes, tools, and methodologies. Able to analyze the impact of a requested change, enhancement, or defect fix; identify dependencies or interrelationships among requirements; and transition requirements for the engagement.
- Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products, provide inputs on design and architecture adhering to industry standards/practices, analyze various frameworks/tools, review code, and provide feedback on improvement opportunities.
- Architecture Tools and Frameworks: Working knowledge of industry architecture tools and frameworks. Able to identify the pros and cons of available tools and frameworks on the market, apply them per customer requirements, and explore new tools/frameworks for implementation.
- Architecture Concepts and Principles: Working knowledge of architectural elements, the SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional-capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.
- Analytics Solution Design: Knowledge of statistical and machine learning techniques such as classification, linear regression modeling, clustering, and decision trees. Able to identify the cause of errors and their potential solutions.
- Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies:
- Accountability: Takes responsibility for and ensures the accuracy of own work, as well as the work and deadlines of the team.
- Collaboration: Shares information within the team, participates in team activities, and asks questions to understand other points of view.
- Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
- Customer Focus: Identifies trends and patterns emerging from customer preferences and works toward customizing/refining existing services to exceed customer needs and expectations.
- Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
- Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
- Resolves Conflict: Displays sensitivity in interactions and strives to understand others' views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
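A core ETL test like "validate data loads" often starts with source-to-target reconciliation: compare row counts and an order-insensitive content fingerprint. A sketch of that idea (illustrative; real suites would run equivalent checks as SQL against both systems, and the sample rows are invented):

```python
import hashlib

# Source-to-target reconciliation sketch for ETL testing.

source = [("C1", "alice", 100), ("C2", "bob", 250)]
target = [("C2", "bob", 250), ("C1", "alice", 100)]  # same rows, different order

def fingerprint(rows):
    """XOR of per-row hashes: identical row sets match regardless of order."""
    fp = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).hexdigest()
        fp ^= int(digest, 16)
    return fp

counts_match = len(source) == len(target)
content_matches = fingerprint(source) == fingerprint(target)
```

Count and fingerprint checks catch dropped, duplicated, or corrupted rows cheaply; column-level transformation rules then get targeted test cases of their own.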

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for a highly experienced and visionary Data Analyst Architect to lead our data analytics team. You should possess a deep understanding of data systems, extensive experience in analytics architecture, and proven leadership skills. Your role will involve designing and optimizing data solutions, providing technical guidance, and managing a team to deliver impactful data-driven projects. Your responsibilities will include designing scalable and efficient data architectures to support analytics and business intelligence needs. You will establish and maintain data governance standards to ensure data quality, security, and compliance. Additionally, you will lead the development of actionable insights from complex datasets, collaborate with stakeholders to translate business goals into data-driven solutions, and manage, mentor, and inspire a team of data analysts and engineers. In terms of technology and tools, you should stay updated with emerging trends in data analytics and architecture. You will guide the adoption of modern tools such as Power BI, Tableau, Apache Spark, Snowflake, AWS, and Azure to enhance analytics capabilities. Project management skills are essential as you oversee end-to-end project lifecycles, identify risks, and develop mitigation strategies to ensure timely and quality deliverables. Collaboration with cross-functional teams to align analytics architecture with organizational strategies is key. You will present findings and recommendations to executives and other stakeholders. The ideal candidate will have a Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field with a minimum of 10+ years of experience in data analytics, architecture, and team leadership. 
Technical skills required include expertise in data modeling, ETL processes, and database design; proficiency in SQL, Python, R, and other analytical tools; experience with big data technologies such as Hadoop, Spark, and Kafka; and a strong understanding of data visualization platforms like Tableau, Power BI, and Looker. Familiarity with cloud platforms and services (AWS, Azure, GCP), machine learning frameworks, and predictive analytics is also desired. Leadership skills, problem-solving abilities, and certifications in data analytics or cloud platforms are preferred qualifications. Experience in implementing data governance frameworks, knowledge of Agile methodologies and tools like Jira and Confluence, and contributions to open-source projects or a portfolio of analytics solutions will be advantageous.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Operations Engineer, Associate

Position Overview

Job Title: Operations Engineer, Associate
Location: Pune, India

Role Description: Responsible for the day-to-day maintenance of the application systems in operation, including identifying and troubleshooting application issues and resolving or escalating them. Responsibilities also include root cause analysis, management communication, and client relationship management in partnership with Infrastructure Service Support team members. Ensures all production changes are made in accordance with life-cycle methodology and risk guidelines. Responsible for coaching and mentoring less experienced team members and/or acting as a subject matter expert.

The role requires in-depth functional knowledge of the supported application(s) and their interdependencies. We are looking for an experienced, detail-oriented person capable of integrating product knowledge, research, and testing to answer complex questions about product behavior and provide end-to-end solutions that permanently fix reported issues. The engineer will help customer teams and other team members understand how customers can achieve desired outcomes using the applications as they exist today. The output could range from FAQs and knowledge base articles describing how customers can operate the product to achieve selected outcomes, to end-to-end coding solutions for reported issues. The engineer will liaise with global stakeholders and vendors to deliver technology solutions as part of the yearly book of work, should be able to understand the functional requirements and expectations of the various stakeholders and work toward an appropriate plan of action, and will work with product vendors and lead upgrades as applicable.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above

Your key responsibilities:
- Research, design, implement, and manage software programs.
- Test and evaluate new programs.
- Identify areas for modification in existing programs and subsequently develop these modifications.
- Oversee resolution of technical issues coming from customer teams; fix and deliver solutions for customer issues.
- Follow ITIL processes, including incident management, change management, release management, problem management, and knowledge management.
- Apply strong problem-solving and communication skills; work under pressure with a high sense of urgency.
- Proactively identify potential incidents and problems, as well as availability issues.
- Manage any IT security incidents that may occur in the application.
- Identify risks and issues, and contribute to service-management-related audits.
- Perform environment maintenance and management.
- Deploy software tools, processes, and metrics.
- Perform standard recurring activities such as data and environment refreshes.
- Act as a liaison between the customer-facing teams and the Product and Engineering organization for the management and resolution of all technical questions and issues.
- Work closely with other developers and with business and systems analysts.
- Maintain detailed documentation, ranging from knowledge base articles to live logging of incidents for post-mortems.
- Ensure delivery timelines and SLA obligations established with internal and external stakeholders are observed and met; escalate as necessary using judgment and discretion.
- Develop a deep understanding of the application platform across all product lines and clearly articulate support decisions and findings.
- Work closely with internal teams to stay up to date on product features, changes, and issues.

Your skills and experience:
- 6+ years of total experience, with at least 5 years in software development/support engineering.
- Advanced knowledge of Java / C# / .NET debugging and scripting (PowerShell, Unix shell, or similar).
- Advanced knowledge of MS SQL Server, SSIS, Tableau, and ETL processes.
- Working knowledge of the SDLC and Agile processes.
- Demonstrable experience in leading projects to successful conclusions.
- Strong customer focus, with experience working with cross-functional/cross-department teams.
- A self-starter with strong organizational skills, resolution management, and superior written and verbal communication skills.

Educational qualifications: B.E. / B.Tech. / Master's degree in Computer Science or equivalent. ITIL certification is good to have.

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams: Please visit our company website for further information. We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
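The "proactively identify potential incidents" duty above usually starts with automated log scanning: flag error bursts before a customer reports them. A minimal sketch (the log format, threshold, and messages are hypothetical):

```python
import re

# Minimal log-scan sketch: raise an incident when errors cross a threshold.

log_lines = [
    "2024-05-01 10:00:01 INFO  job started",
    "2024-05-01 10:00:05 ERROR db connection refused",
    "2024-05-01 10:00:06 ERROR db connection refused",
    "2024-05-01 10:00:09 WARN  retrying",
    "2024-05-01 10:00:12 ERROR db connection refused",
]

def error_count(lines):
    """Count lines whose severity field is ERROR."""
    return sum(1 for ln in lines if re.search(r"\bERROR\b", ln))

THRESHOLD = 3  # raise an incident at or above this many errors
errors = error_count(log_lines)
raise_incident = errors >= THRESHOLD
```

In practice this kind of check runs continuously in a monitoring tool and feeds the ITIL incident-management process mentioned above, rather than being a one-off script.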

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

Your opportunity to make a real impact and shape the future of financial services is waiting for you. Let's push the boundaries of what's possible together. As a Senior Director of Software Engineering at JPMorgan Chase within the Consumer and Community Banking division, you will be responsible for leading various technical domains, overseeing the activities of multiple departments, and fostering cross-functional collaboration. Your technical expertise will be utilized across different teams to promote the adoption and implementation of advanced technical methods, helping the firm stay ahead of industry trends, best practices, and technological advancements. Leads multiple technology and process implementations across departments to achieve firmwide technology objectives. Directly manages multiple areas with strategic transactional focus. Provides leadership and high-level direction to teams while frequently overseeing employee populations across multiple platforms, divisions, and lines of business. Acts as the primary interface with senior leaders, stakeholders, and executives, driving consensus across competing objectives. Manages multiple stakeholders, complex projects, and large cross-product collaborations. Influences peer leaders and senior stakeholders across the business, product, and technology teams. Champions the firm's culture of diversity, equity, inclusion, and respect. Required qualifications, capabilities, and skills include formal training or certification on data management concepts and 10+ years applied experience. In addition, 5+ years of experience leading technologists to manage, anticipate and solve complex technical items within your domain of expertise. Proven experience in designing and developing large scale data pipelines for batch & stream processing. 
Strong understanding of Data Warehousing, Data Lake, ETL processes and Big Data technologies (e.g., Hadoop, Snowflake, Databricks, Apache Spark, PySpark, Airflow, Apache Kafka, Java, open file and table formats, Git, CI/CD pipelines, etc.). Expertise with public cloud platforms (e.g., AWS, Azure, GCP) and modern data processing & engineering tools. Excellent communication, presentation, and interpersonal skills. Experience developing or leading large or cross-functional teams of technologists. Demonstrated prior experience influencing across highly matrixed, complex organizations and delivering value at scale. Experience leading complex projects supporting system design, testing, and operational stability. Experience with hiring, developing, and recognizing talent. Extensive practical cloud-native experience. Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field. Preferred qualifications, capabilities, and skills include experience working at code level and the ability to be hands-on performing PoCs and code reviews. Experience in Data Modeling (ability to design Conceptual, Logical and Physical Models and ERDs, and proficiency in data modeling software like ERwin). Experience with Data Governance, Data Privacy & Subject Rights, Data Quality & Data Security practices. Strong understanding of Data Validation / Data Quality. Experience supporting large-scale AI/ML data requirements. Experience in data visualization and BI tools is a huge plus.
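The batch side of the large-scale pipelines this role describes can be illustrated with a minimal, framework-agnostic sketch in Python. The record fields, values, and dead-letter pattern here are invented for illustration and are not specific to any stack named in the listing:

```python
# Hypothetical raw records, standing in for a batch extracted from a source system.
RAW_BATCH = [
    {"account_id": "A1", "balance": "1200.50", "currency": "USD"},
    {"account_id": "A2", "balance": "not-a-number", "currency": "USD"},  # malformed record
    {"account_id": "A3", "balance": "300.00", "currency": "EUR"},
]

def transform(record):
    """Parse and normalize one raw record; raises on malformed input."""
    return {
        "account_id": record["account_id"],
        "balance": round(float(record["balance"]), 2),
        "currency": record["currency"].upper(),
    }

def run_batch(raw_records):
    """Transform a batch, routing failures to a dead-letter list instead of
    aborting the whole run -- a common pattern in batch ETL pipelines."""
    loaded, dead_letter = [], []
    for rec in raw_records:
        try:
            loaded.append(transform(rec))
        except (KeyError, ValueError) as exc:
            dead_letter.append({"record": rec, "error": str(exc)})
    return loaded, dead_letter

loaded, dead = run_batch(RAW_BATCH)
print(len(loaded), len(dead))  # 2 good records, 1 dead-lettered
```

In a real Spark or Airflow pipeline the same extract-transform-validate-load split applies; the dead-letter list would typically become a quarantine table or topic rather than an in-memory list.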

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

jaipur, rajasthan

On-site

As a Senior BI Engineer at Hydro Global Business Services (GBS), you will have a crucial role in enabling data-driven decision-making by designing, developing, and maintaining robust BI solutions. You will collaborate with various business stakeholders to gather BI requirements and develop optimized reports, dashboards, and visualizations that provide actionable insights. Additionally, you will be responsible for building and maintaining ETL/ELT processes to support the transition to modern cloud platforms and ensuring data quality, consistency, and availability. Your main responsibilities will include designing, developing, and maintaining BI solutions using Snowflake Data Cloud and related technologies, analyzing business requirements to create scalable BI solutions, and collaborating with Data Engineers to ensure robust data pipelines and warehousing solutions. You will also play a key role in interpreting business data, uncovering trends, and presenting findings through effective storytelling and visualizations. To be successful in this role, you should have at least 7 years of hands-on experience in Business Intelligence development with a focus on data visualization and dashboarding, particularly using Power BI. A strong business analyst mindset, proficiency in SQL-based databases, and familiarity with cloud-based data warehouse platforms like Snowflake are highly desirable. Exposure to the manufacturing domain is considered advantageous. In terms of education and skills, you should hold a Bachelor's degree in Computer Science, Information Systems, Business Administration, or a related field. Fluency in English, strong analytical and problem-solving skills, excellent communication abilities, and the capacity to work independently and as part of a team in a dynamic environment are essential. Your technical skills should include expertise in SQL, ETL processes, data modeling, and experience with Tabular models and BI solution architectures. 
Leadership qualities, analytical competence, and soft skills such as effective communication, collaboration, adaptability, and a commitment to quality and deadlines are also key attributes for this role. Working at Hydro GBS offers you the opportunity to be part of a global team in a flexible work environment, collaborating with experts in the field. You will have the chance to grow with the company, gain new certificates, and benefit from an attractive package. If you meet the requirements and are interested in this position, please apply by uploading your CV and optionally a cover letter through our online system. Recruiter: Lima Mathew, Sr. HR Advisor, People Resourcing.

Posted 2 weeks ago

Apply

10.0 - 12.0 years

0 Lacs

mumbai, maharashtra, india

On-site

Job Description: About us. At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview. GF (Global Finance) Global Financial Control India (GFCI) is part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business (LOBs) and Enterprise Finance functions. 
The capabilities hosted include General Accounting & Reconciliations, Legal Entity Controllership, Corporate Sustainability Controllership, Corporate Controllership, Management Reporting & Analysis, Finance Systems Support, Operational Risk and Controls, Regulatory Reporting and Strategic initiatives. Job Description. The Finance Emissions Accounting & Reporting team, a part of the Global Financial Control-Corporate Sustainability Controller organization within the CFO Group, plays a critical role in supporting the calculation of asset-level balance sheet Financed Emissions, which are integral to the Bank's goal of achieving net-zero greenhouse gas emissions by 2050. The role is responsible for building data sourcing processes, data research and analytics using available tools, supporting model input data monitoring, and developing the necessary data and reporting frameworks to support our approaches to net-zero progress alignment, target setting, client engagement and reputational risk review, empowering banking teams to assist clients on net-zero financing strategies and specific commercial opportunities. The role will support and partner with business stakeholders in the Enterprise Climate Program Office, Technology, Climate and Credit Risk, the Global Environment Group, Lines of Business, Legal Entity Controllers and Model Risk Management. Additionally, the role will support data governance, lineage, and controls by building, improving and executing data processes. Responsibilities. Net zero transition planning and execution: Partners with GEG, Program Office and Lines of Business in developing and executing an enterprise-wide net zero transition plan and operational roadmap, with a focus on analysis and reporting capabilities, data procurement, and liaising with consultants, external data providers, Climate Risk and Technology functions.
Data development & Operations: Research data requirements, produce executive-level and detailed data summaries, validate the accuracy, completeness, reasonableness and timeliness of the dataset, and develop desktop procedures for BAU operations. Perform data review and test technology implementation for financed emissions deliverables. Execute BAU processes such as new data cycle creation, data controls and data quality processes. Produce data summary materials and walk through them with the leadership team. Data Analytics & Strategy: Analyze the data and explain how granular data movements across history affect the new results. Find trends of data improvement or areas for improvement. Develop automated data analyses and answer common questions to justify changes in the data. Support ad hoc analytics of bank-wide and client net zero commitment implementation, with an initial focus on automation of financed emissions analysis, reporting against PCAF standards, and net zero transition preparedness analytics and engagement to enhance strategy for meeting emissions goals for target sectors. Requirements. Education. Bachelor's degree in data management or analytics, engineering, sustainability, finance or other related field, OR Master's degree in data science, earth/climate sciences, engineering, sustainability, natural resource management, environmental economics, finance or other related field. Certification (if any): N/A. Experience Range. Minimum 10+ years in Climate, Fuel Emissions, finance or financial reporting. Three (3) or more years of experience in statistical and/or data management, analytics and visualization (intersection with financial services strongly preferred). Foundational Skills.
Deep expertise in SQL, Excel, Python, automation & optimization, and project management. Knowledge of Alteryx, Tableau, R (knowledge of NLP, data scraping and generative AI welcome). Knowledge of data architecture concepts, data models and ETL processes. Experience extracting and combining data from multiple sources, and aggregating data to support model development. Understanding of corporate accounting data integrity considerations and risk management. Practical experience supporting data procurement and analytics for leading ESG reporting frameworks, specifically TCFD and PCAF, and experience with climate-related data providers and sources (S&P, MSCI, MJB, etc.). Strong technical and visualization skills, with the ability to understand the business goals and needs, and a commitment to delivering recommendations that will guide strategic decisions. Experience in multiple database environments such as Oracle, Hadoop, and Teradata. Strong leadership skills and proven ability in motivating employees and promoting teamwork. Deep understanding of how data processes work and the ability to solve dynamically evolving and complex data challenges as part of day-to-day activities. Strong documentation and presentation skills to explain the data analysis in a visual and procedural way suited to the audience. Excellent interpersonal, management, and teamwork skills. Highly motivated self-starter with excellent time management skills and the ability to effectively manage multiple priorities and timelines. High level of independent decision-making ability. Ability to quickly identify risks and determine reasonable solutions. Demonstrated ability to motivate others in a high-stress environment to achieve goals. Desired Skills. Advanced knowledge of Finance. Advanced knowledge of Climate Risk. Work Timings. Window 11:30 AM to 8:30 PM (9-hour shift; may require stretch during close periods). Job Location. Mumbai.
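The accuracy, completeness, reasonableness, and timeliness controls this role executes can be sketched as simple rule checks. The record fields, values, and thresholds below are hypothetical, not Bank of America's actual data model:

```python
from datetime import date

# Illustrative financed-emissions-style records; field names are invented.
RECORDS = [
    {"client_id": "C1", "financed_emissions_t": 120.0, "as_of": date(2024, 12, 31)},
    {"client_id": "C2", "financed_emissions_t": None, "as_of": date(2024, 12, 31)},   # missing value
    {"client_id": "C3", "financed_emissions_t": -5.0, "as_of": date(2023, 6, 30)},    # negative and stale
]

def data_quality_report(records, current_period=date(2024, 12, 31)):
    """Run completeness, reasonableness, and timeliness checks and
    return a per-dimension count of failing records."""
    report = {"completeness": 0, "reasonableness": 0, "timeliness": 0}
    for r in records:
        if r["financed_emissions_t"] is None:
            report["completeness"] += 1
            continue  # remaining checks need a value
        if r["financed_emissions_t"] < 0:  # emissions cannot be negative
            report["reasonableness"] += 1
        if r["as_of"] != current_period:  # record belongs to a stale data cycle
            report["timeliness"] += 1
    return report

print(data_quality_report(RECORDS))
```

In a BAU control process, each count would feed an exception queue reviewed before the data cycle is signed off.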

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As an Adobe Campaign Consultant, your primary responsibility will be to lead client engagements by understanding their technical and business requirements. You will conduct workshops, present solution insights, and analyze the benefits of various software features and developments. Your role will involve providing hands-on consultancy for Adobe Campaign setup, post-implementation support, and developing client-specific solutions. You will also be responsible for evaluating client requirements, designing and configuring Adobe Campaign workflows, customizing campaign configurations, troubleshooting technical issues, and developing implementation strategies. In addition to the technical aspects, you will prepare detailed estimates and proposals for consulting engagements, maintain accurate documentation, conduct knowledge transfer sessions and training for client teams, manage product upgrades, and provide project status updates to management. Collaboration with the onshore consulting team and adherence to high-quality technical delivery standards will be crucial in this role. To excel in this position, you must possess strong proficiency in SQL development and ETL processes, experience with scripting or programming, and a proven track record in implementing Adobe Campaign. Excellent communication skills, the ability to work independently under pressure, and willingness to work across different time zones are essential. You should also have strong presentation skills, as well as experience managing multiple clients and expectations simultaneously. Preferred qualifications include experience handling multiple projects and system integrations, familiarity with Adobe Experience Cloud applications, advanced SQL knowledge, working knowledge of XML and DOM-related technologies, and exposure to dynamic web development and personalization strategies. This is a full-time position with the benefit of working from home and following a US shift schedule.,

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Data Engineering Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be actively involved in problem-solving and contributing to the overall success of the data platform initiatives, ensuring that all components work seamlessly together to meet organizational goals. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work related problems. - Engage in continuous learning to stay updated with industry trends and best practices. - Assist in the documentation of data architecture and integration processes. Professional & Technical Skills: - Must To Have Skills: Proficiency in Data Engineering. - Strong understanding of data modeling techniques and best practices. - Experience with ETL processes and data pipeline development. - Familiarity with cloud-based data platforms and services. - Ability to work with various data storage solutions, including relational and non-relational databases. Additional Information: - The candidate should have minimum 3 years of experience in Data Engineering. - This position is based at our Bengaluru office. 
- A 15 years full time education is required.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

NTT DATA is seeking a Celonis Analyst/Process Mining Consultant to join the team in Pune, Maharashtra, India. As a Celonis Analyst with 5+ years of experience, you will leverage Celonis EMS to analyze business processes, identify inefficiencies, and drive automation. You will collaborate with stakeholders to map processes, build dashboards, and provide insights for process transformation initiatives. Key Responsibilities: - Design and develop process mining analyses using Celonis EMS to identify inefficiencies and improvement opportunities. - Collaborate with stakeholders to map current processes. - Build dashboards, KPIs, and custom analyses using Celonis PQL. - Support process transformation initiatives with insights from Celonis analyses. - Work with data engineers to extract, transform, and load relevant data into Celonis. - Continuously monitor, maintain, and improve Celonis data models and visualizations. Required Skills: - 5+ years of experience in data analytics or process improvement, with specific experience in Celonis EMS. - Strong knowledge of process mining principles and methodologies. - Proficiency in PQL and Celonis IBC/EMS. - Experience integrating Celonis with ERP systems like SAP, Oracle, or ServiceNow. - Solid SQL skills and understanding of data modeling and ETL processes. - Strong analytical, communication, and problem-solving skills. Good to Have: - Knowledge of Python or R for extended analytics capabilities. - Familiarity with BPMN or process mapping tools. - Experience in Six Sigma, Lean, or other process improvement methodologies. - Celonis Analyst or Implementation certification. Education: - Bachelor's degree in Computer Science, Engineering, Business Analytics, or related field. About NTT DATA: NTT DATA is a global innovator of business and technology services, serving 75% of the Fortune Global 100. 
With diverse experts in over 50 countries and a robust partner ecosystem, NTT DATA offers business and technology consulting, data and artificial intelligence, industry solutions, and more. As a leading provider of digital and AI infrastructure, NTT DATA is committed to helping clients innovate, optimize, and transform for long-term success. Visit us at us.nttdata.com.
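The kind of bottleneck insight a process-mining analysis surfaces can be sketched without any Celonis-specific tooling. The event log, case IDs, and activity names below are invented for illustration:

```python
from datetime import datetime
from collections import defaultdict

# Toy event log in the (case id, activity, timestamp) shape that
# process-mining tools ingest.
EVENT_LOG = [
    ("PO-1", "Create PO", "2024-01-01 09:00"),
    ("PO-1", "Approve PO", "2024-01-03 09:00"),
    ("PO-1", "Pay Invoice", "2024-01-10 09:00"),
    ("PO-2", "Create PO", "2024-01-02 10:00"),
    ("PO-2", "Approve PO", "2024-01-02 12:00"),
    ("PO-2", "Pay Invoice", "2024-01-04 12:00"),
]

def slowest_transition(event_log):
    """Group events by case, then find the activity-to-activity transition
    with the highest average waiting time in hours -- a basic bottleneck measure."""
    cases = defaultdict(list)
    for case_id, activity, ts in event_log:
        cases[case_id].append((datetime.strptime(ts, "%Y-%m-%d %H:%M"), activity))
    waits = defaultdict(list)
    for events in cases.values():
        events.sort()  # order each case's events chronologically
        for (t0, a0), (t1, a1) in zip(events, events[1:]):
            waits[(a0, a1)].append((t1 - t0).total_seconds() / 3600)
    averages = {pair: sum(hours) / len(hours) for pair, hours in waits.items()}
    return max(averages, key=averages.get), averages

bottleneck, averages = slowest_transition(EVENT_LOG)
print(bottleneck)  # ("Approve PO", "Pay Invoice")
```

In Celonis the equivalent aggregation would be expressed declaratively over the EMS data model rather than computed in application code; the underlying case-grouping logic is the same.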

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

haryana

On-site

As an Expert Engineer in the GPS Technology department located in Gurugram, India, reporting to the Project Manager, you will be an integral part of the Technology function providing IT services to Fidelity International business globally. Your role involves developing and supporting business applications that are crucial for revenue, operational efficiency, compliance, finance, legal, customer service, and marketing functions. Your primary responsibility will be to understand system requirements, analyze, design, develop, and test application systems following defined standards. Your expertise in software designing, programming, engineering, and problem-solving skills will be crucial in delivering value to the business efficiently and with high quality. Your essential skills will include working on Data Ingestion, Transformation and Distribution using AWS or Snowflake, experience with SnowSQL, Snowpipe, ETL/ELT tools, and hands-on knowledge of AWS services like EC2, Lambda, ECS/EKS, DynamoDB, VPCs. You will be familiar with building data pipelines leveraging Snowflake's capabilities and integrating technologies that work with Snowflake. Moreover, you will design Data Ingestion and Orchestration Pipelines using AWS and Control M, establish strategies for data extraction, ingestion, transformation, automation, and consumption, and ensure data quality and code coverage. Your ability to experiment with new technologies, passion for technology, problem-solving, and effective collaboration skills will be essential for success in this role. To qualify for this position, you should hold a B.E./B.Tech. or M.C.A. in Computer Science from a reputed University with a total of 7 to 10 years of relevant experience. Personal characteristics such as good interpersonal and communication skills, being a strong team player, strategic thinking, self-motivation, and problem-solving abilities will be highly valued. 
Join us in our mission to build better financial futures for our clients and be a part of a team that values your well-being, supports your development, and offers a flexible work environment. Visit careers.fidelityinternational.com to explore more about our work culture and how you can contribute to our team.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

You should hold a Bachelor's degree in Computer Science, Information Systems, or a related field. With 2+ years of experience in Power BI development, you should possess a strong understanding of data warehousing, data modeling (star/snowflake schema), and ETL processes. Your experience with SQL and relational databases (e.g., SQL Server, Oracle) will be crucial in this role. Excellent analytical and problem-solving skills are necessary, along with the ability to work independently and as part of a team.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Data Engineering Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be actively involved in problem-solving and contributing to the overall success of the data platform initiatives, ensuring that all components work seamlessly together to meet organizational goals. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work related problems. - Engage in continuous learning to stay updated with industry trends and technologies. - Assist in the documentation of data architecture and integration processes. Professional & Technical Skills: - Must To Have Skills: Proficiency in Data Engineering. - Strong understanding of data modeling techniques and best practices. - Experience with ETL processes and data pipeline development. - Familiarity with cloud-based data platforms and services. - Knowledge of data governance and data quality principles. Additional Information: - The candidate should have minimum 3 years of experience in Data Engineering. - This position is based at our Bengaluru office. - A 15 years full time education is required.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Position Summary: The Group Investment Management (GIM) Team is hiring a Senior Manager, Product Tooling & Data (Data Focused) to work in the Product Tooling and Data Team. Your ability to interpret and use data science and analytics to deliver your work and support the team is essential. You will proactively measure and predict programme execution risk, using machine learning and AI to develop early warning risk indicators that help steer senior leadership attention to the parts of the portfolio where delivery of planned benefits and outcomes is at risk of not being met, or is likely to require substantially more funding than originally planned. The predictive risk metrics you develop will allow timely management intervention, avoid costly last-minute write-offs, and help foster a culture of delivery discipline and delivery transparency. You will play a pivotal role in transforming complex programme data into predictive insights that drive strategic investment decisions. You'll take the lead on developing machine learning models and early warning indicators to proactively identify delivery risks across a multi-million-pound change portfolio. This is a unique opportunity to apply your data expertise in a high-impact, enterprise-level environment that values innovation, transparency, and delivery excellence. Come join this team with multifaceted strengths of product owners and make an impact through your work! Proud to share: LSEG in India is Great Place to Work certified (Jun 25 - Jun 26). Learn more about life and purpose of our company directly from our India colleagues' video: Bengaluru, India | Where We Work | LSEG. Key Responsibilities: Lead the design and implementation of machine learning models to predict programme delivery risks and financial overruns. Help integrate the insight generated into existing dashboards and visualisations to communicate insights to senior leadership, in collaboration with existing MI teams.
Ensure that risk metrics and insights are actionable and trackable. Collaborate with senior stakeholders across the business to understand data needs and translate them into technical solutions. Conduct data cleaning, preprocessing, and integration from multiple sources (internal databases, APIs). Contribute to the development of the Group-wide delivery framework and assurance standards. Stay current with advancements in AI/ML and apply them to improve risk prediction and decision-making. Support the annual investment planning process with data-driven analysis and scenario modelling. Required Experience & Qualifications: Proficiency with Python or R, with experience in libraries such as scikit-learn, TensorFlow, or similar. Experience of LLM prompt engineering for systematic feature extraction from unstructured data. Strong SQL and data wrangling skills; experience with ETL processes. Solid understanding of statistical analysis and hypothesis testing. Experience building and deploying machine learning models in a business context. Ability to communicate technical insights to non-technical audiences through visualisation tools (e.g., Tableau, matplotlib). Familiarity with fine-tuning small LLM models to automate small topic extraction tasks from unstructured data. Familiarity with cloud platforms (e.g., AWS), especially AWS Bedrock and the Snowflake data platform. Experience with enterprise programme management tools such as Clarity, Asana, and JIRA. Knowledge of SAP ERP and Workday systems data. Experience in financial services or programme delivery environments. Strong communication and collaboration skills. Self-starter with the ability to manage multiple priorities in a fast-paced environment. Commitment to continuous learning and innovation. LSEG is committed to encouraging a diverse, equitable, and inclusive work environment, ensuring equal opportunities for all employees, regardless of their background.
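One simple form of the early-warning indicators described above is an outlier test on a spend-versus-delivery ratio. The programme names, metrics, and threshold below are illustrative inventions, not LSEG's actual methodology:

```python
from statistics import mean, stdev

# Hypothetical programme metrics: fraction of budget spent vs fraction of
# milestones delivered. A spend/delivery ratio well above peers is an
# early warning that benefits are at risk.
PROGRAMMES = {
    "Prog-A": {"spend_pct": 0.55, "delivered_pct": 0.50},
    "Prog-B": {"spend_pct": 0.40, "delivered_pct": 0.45},
    "Prog-C": {"spend_pct": 0.90, "delivered_pct": 0.30},  # overspending, under-delivering
    "Prog-D": {"spend_pct": 0.35, "delivered_pct": 0.40},
}

def early_warning_flags(programmes, z_threshold=1.0):
    """Flag programmes whose spend/delivery ratio is a z-score outlier
    relative to the rest of the portfolio."""
    ratios = {name: p["spend_pct"] / p["delivered_pct"] for name, p in programmes.items()}
    mu, sigma = mean(ratios.values()), stdev(ratios.values())
    return sorted(name for name, r in ratios.items() if (r - mu) / sigma > z_threshold)

print(early_warning_flags(PROGRAMMES))  # ['Prog-C']
```

A production version would replace the single ratio with model-based features (burn rate, milestone slippage, risk register trends) feeding a trained classifier, but the output contract is the same: a short, reviewable list of programmes warranting management attention.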
We offer great employee benefits to make sure everyone performs to the best of their abilities. We offer a hybrid working model. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone's race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs.
Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) ("we") may hold about you, what it's used for, how it's obtained, your rights, and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

You have over 10 years of hands-on programming experience in the Java technology stack and expertise in designing and developing RESTful web services. You possess in-depth knowledge of relational (MySQL, Oracle, SQL Server) and non-relational data stores, with expert-level experience in writing SQL queries within relational databases. Your experience extends to working with NoSQL databases and handling ETL processes to transform data between relational and non-relational formats. You are proficient in using Git (preferably GitHub), GitLab, or other modern code repositories, as well as automation/build tools like Maven, Jenkins, CI, and CloudFormation. Your commitment to achieving outcomes is unwavering as you leverage talent, tools, and solutions to address business challenges. You are capable of making tough decisions regarding talent, tools, and deliverables while effectively communicating the rationale to a broad audience. Your proactive communication with US leads ensures transparency by promptly sharing any challenges or setbacks that may arise. Your proven ability to lead and motivate teams, coupled with excellent communication and interpersonal skills, sets you apart. You have a solid understanding of product development lifecycles and are experienced with Agile and other product development methodologies. Your strong organizational and time management skills enable you to work both independently and collaboratively within a team. You possess a passion for technology and innovation, with additional skills of interest including experience in a fast-paced startup environment, familiarity with SaaS and PaaS products, startup culture, Jira, Azure DevOps, Continuous Integration, and other development tools. In this role, you will lead and mentor a team of talented product engineers, designers, and other professionals to design, develop, and maintain highly scalable systems using Java, React, and SQL & NoSQL databases.
You will ensure technical execution adheres to industry best practices, focusing on code quality, reducing technical debt, and enhancing software architecture, reliability, and performance. Collaborating closely with product managers, you will translate user stories and feature requests into technical requirements, ensuring a consistent approach to building, testing, and releasing all work. You will oversee the entire product development process from ideation to launch, ensuring products meet quality standards and customer expectations. Additionally, you will manage project schedules, budgets, and resources, conduct testing and analysis, troubleshoot complex issues, and communicate effectively with stakeholders. Your role will require you to stay updated on industry trends, contribute to technical discussions and decision-making, and maintain a strong technical background in relevant technologies and methodologies. You will work from the Trivandrum or Bangalore office locations in a full-time capacity, with benefits including Provident Fund, a day shift schedule, yearly bonus, and in-person work. Please note that only candidates available for interviews on weekdays with a maximum notice period of 15 days should apply.,
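The listing above asks for ETL experience transforming data between relational and non-relational formats. A minimal illustrative sketch of that kind of step, denormalizing relational rows into nested documents, is shown below; the table names, fields, and data are hypothetical, not from the posting:

```python
import sqlite3
import json

# Hypothetical relational source: customers and their orders in SQLite.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme Corp');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 99.5);
""")

def to_documents(conn):
    """Transform relational rows into nested JSON-style documents (a common
    relational-to-NoSQL ETL step: denormalize via a join, then group)."""
    rows = conn.execute("""
        SELECT c.id, c.name, o.id, o.total
        FROM customers c LEFT JOIN orders o ON o.customer_id = c.id
        ORDER BY c.id
    """).fetchall()
    docs = {}
    for cust_id, name, order_id, total in rows:
        doc = docs.setdefault(cust_id, {"_id": cust_id, "name": name, "orders": []})
        if order_id is not None:
            doc["orders"].append({"order_id": order_id, "total": total})
    return list(docs.values())

docs = to_documents(conn)
print(json.dumps(docs[0]))
```

Each resulting document is self-contained, which is the usual goal when loading into a document store: reads no longer need the join.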

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

Job Description: As a Domo Specialist at our Bangalore office, you will play a crucial role in designing, developing, and maintaining data dashboards and reports to provide valuable business insights. With over 3 years of experience in Domo development and data visualization, you will leverage your expertise to create interactive dashboards, optimize data models, and ensure data accuracy and security within the Domo environment. Your ability to collaborate with stakeholders, extract, transform, and load data from various sources, and stay updated with the latest Domo features will be key to your success in this role.

Key Responsibilities:
- Develop interactive dashboards and reports using Domo.
- Extract, transform, and load (ETL) data from various sources into Domo.
- Collaborate with stakeholders to understand data requirements and translate them into actionable insights.
- Optimize data models, reports, and visualizations for performance and usability.
- Ensure data accuracy, consistency, and security within the Domo environment.
- Provide support and troubleshooting for Domo-related issues.
- Stay updated with the latest Domo features and best practices.

Required Skills & Qualifications:
- 3+ years of experience in Domo development and data visualization.
- Strong proficiency in building dashboards, reports, and KPIs using Domo.
- Experience with ETL processes and integrating data from multiple sources.
- Good understanding of SQL, data modeling, and business intelligence concepts.
- Ability to thrive in a fast-paced office environment and collaborate effectively with cross-functional teams.
- Strong problem-solving and analytical skills.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be responsible for developing and managing Power BI solutions by designing, developing, and maintaining Power BI reports and dashboards from scratch. Additionally, you will integrate data from various sources into a data lake, ensuring data cleanliness, accuracy, and accessibility. Troubleshooting and resolving issues related to data integration and dashboard functionality will also be part of your role. Moreover, you will create and maintain data models to support reporting and analytics needs, design and implement ETL processes to move data from source systems to the data lake and Power BI, and optimize Power BI reports and dashboards for performance and usability. Documentation of data models, ETL processes, and Power BI solutions will be essential to ensure maintainability and knowledge sharing.

Your responsibilities will also include overseeing and supporting processes by reviewing daily transactions on performance parameters, reviewing performance dashboards and team scores, and supporting the team in improving performance parameters by providing technical support and process guidance. You will be required to record, track, and document all queries received, along with problem-solving steps taken and successful and unsuccessful resolutions, and to ensure adherence to standard processes and procedures for resolving client queries within defined SLAs.

In cases of technical escalations, you will be expected to provide effective diagnosis and troubleshooting of client queries, manage and resolve technical roadblocks or escalations as per SLA and quality requirements, and escalate unresolved issues to the appropriate channels in a timely manner. Offering product support and resolution to clients, troubleshooting queries in a user-friendly and professional manner, and providing alternative solutions when necessary to retain customer business will also be part of your responsibilities.

Furthermore, you will be responsible for building people capability to ensure operational excellence and maintain superior customer service levels. This includes mentoring and guiding Production Specialists on improving technical knowledge, conducting trainings to bridge skill gaps, developing product-specific trainings for Production Specialists, and staying updated with product features, changes, and updates. Identifying common problems, recommending resolutions, participating in self-learning opportunities, and maintaining personal networks to update job knowledge will also be expected of you.

Your performance will be measured on key parameters such as the number of cases resolved per day, compliance with process and quality standards, meeting SLAs, productivity, efficiency, absenteeism, triages completed, technical test performance, and customer feedback. Mandatory skills include Power BI visualization on the cloud, and the required experience is 5-8 years.

Join Wipro to reinvent your world and be part of an end-to-end digital transformation partner with bold ambitions. Realize your ambitions in an environment that encourages constant evolution and reinvention. Wipro welcomes applications from people with disabilities.
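The role above centers on data models that support reporting, typically a dimensional (star-schema) design where dashboard measures become join-plus-aggregate queries. A minimal sketch, with illustrative table and column names not taken from the posting:

```python
import sqlite3

# A minimal star-schema sketch: one dimension table and one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Hardware'), (2, 'Software');
    INSERT INTO fact_sales  VALUES (1, 100.0), (1, 50.0), (2, 200.0);
""")

# A dashboard measure such as "sales by category" is then a join + aggregate:
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS total_sales
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('Hardware', 150.0), ('Software', 200.0)]
```

The same shape carries over to BI tools: facts hold numeric measures, dimensions hold the attributes users slice by.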

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

haryana

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we are a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Microsoft Fabric professionals in the following areas:

**Position**: Data Analytics Lead

**Experience**: 8+ Years

**Responsibilities**:
- Build, manage, and foster a high-functioning team of data engineers and data analysts.
- Collaborate with business and technical teams to capture and prioritize platform ingestion requirements.
- Bring experience of working with the manufacturing industry in building a centralized data platform for self-service reporting.
- Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success.
- Manage customer, partner, and internal data on the cloud and on-premises.
- Evaluate and understand current data technologies and trends and promote a culture of learning.
- Build an end-to-end data strategy, from collecting requirements from the business to modeling the data and building reports and dashboards.

**Required Skills**:
- Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse / Microsoft Fabric / Azure Databricks.
- Accountability for the data group's activities, including architecting, developing, and maintaining a centralized data platform covering operational data, the data warehouse, the data lake, Data Factory pipelines, and data-related services.
- Experience in designing and building operationally efficient pipelines utilizing core Azure components such as Azure Data Factory, Azure Databricks, and PySpark.
- Strong understanding of data architecture, data modeling, and ETL processes.
- Proficiency in SQL and PySpark.
- Strong knowledge of building Power BI reports and dashboards.
- Excellent communication skills.
- Strong problem-solving and analytical skills.

**Required Technical/Functional Competencies**:
- Domain/Industry Knowledge
- Requirement Gathering and Analysis
- Product/Technology Knowledge
- Architecture Tools and Frameworks
- Architecture Concepts and Principles
- Analytics Solution Design
- Tools & Platform Knowledge

**Accountability**:
- Takes responsibility for and ensures the accuracy of own work, as well as the work and deadlines of the team.

**Required Behavioral Competencies**:
- Collaboration
- Agility
- Customer Focus
- Communication
- Drives Results
- Resolves Conflict

**Certifications**: Mandatory

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
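The "operationally efficient pipelines" the role above describes commonly rely on watermark-based incremental loading: copy only rows newer than the last recorded high-water mark, then advance the mark. A minimal sketch of the pattern in plain Python/SQLite (table and column names are illustrative, and this stands in for what an Azure Data Factory incremental pipeline would orchestrate):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_events (id INTEGER PRIMARY KEY, payload TEXT, updated_at INTEGER);
    CREATE TABLE lake_events   (id INTEGER PRIMARY KEY, payload TEXT, updated_at INTEGER);
    CREATE TABLE watermark (last_updated_at INTEGER);
    INSERT INTO watermark VALUES (0);
    INSERT INTO source_events VALUES (1, 'a', 100), (2, 'b', 200);
""")

def incremental_load(conn):
    """Copy only rows newer than the stored watermark, then advance it."""
    (wm,) = conn.execute("SELECT last_updated_at FROM watermark").fetchone()
    rows = conn.execute(
        "SELECT id, payload, updated_at FROM source_events WHERE updated_at > ?",
        (wm,),
    ).fetchall()
    conn.executemany("INSERT OR REPLACE INTO lake_events VALUES (?, ?, ?)", rows)
    if rows:
        conn.execute("UPDATE watermark SET last_updated_at = ?",
                     (max(r[2] for r in rows),))
    conn.commit()
    return len(rows)

first = incremental_load(conn)   # initial run picks up both existing rows
conn.execute("INSERT INTO source_events VALUES (3, 'c', 300)")
second = incremental_load(conn)  # subsequent run picks up only the new row
```

Because each run touches only the delta, the pipeline's cost scales with new data rather than table size, which is what makes the pattern operationally efficient.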

Posted 2 weeks ago

Apply

9.0 - 13.0 years

0 Lacs

chennai, tamil nadu, india

On-site

Job Description:

About Us

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services

Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview

As a part of Global Risk Analytics, Enterprise Risk Analytics (ERA) is responsible for the development of cross-business holistic analytical models and tools. Team responsibilities include:
- Financed Emissions: responsible for supporting the calculation of asset-level balance sheet Financed Emissions, which are integral to the Bank's goal of achieving Net-zero greenhouse gas emissions by 2050.
- Financial Crimes Modelling & Analytics: responsible for enterprise-wide financial crimes and compliance surveillance model development and ongoing monitoring across all lines of business globally.
- Operational Risk: responsible for operational risk loss forecasting and capital model development for CCAR/stress testing and regulatory capital reporting/economic capital measurement purposes.
- Business Transformations: a central team of Project Managers and quantitative software engineers partnering with coverage-area ERA teams with the end goal of onboarding ERA production processes onto GCP/production platforms, as well as identifying risks/gaps in ERA processes which can be fixed with well-designed and controlled software solutions.
- Trade Surveillance Analytics: responsible for modelling and analytics supporting trade surveillance activities within risk.
- Advanced Analytics: responsible for driving research, development, and implementation of new enhanced risk metrics and providing quantitative support for loss forecasting and stress testing requirements, including process improvement and automation.

Job Description: The role will be responsible for independently conducting quantitative analytics and modeling projects.

Responsibilities:
- Perform model development proof of concept, research model methodology, explore internal & external data sources, design model development data, and develop preliminary models.
- Conduct complex data analytics on modeling data; identify, explain & address data quality issues; apply data exclusions; perform data transformation; and prepare data for model development.
- Analyze portfolio definition, define model boundary, analyze model segmentation, develop Financed Emissions models for different asset classes, and analyze and benchmark model results.
- Work with the Financed Emissions Data Team & Climate Risk Tech on the production process of model development & implementation data, including supporting data sourcing efforts, providing data requirements, performing data acceptance testing, etc.
- Work with the Financed Emissions Production & Reporting Team on model implementation, model production run analysis, and result analysis & visualization.
- Work with the ERA Model Implementation team & GCP Tech on model implementation, including opining on implementation design, providing implementation data models & requirements, performing model implementation result testing, etc.
- Work with Model Risk Management (MRM) on model reviews and obtain model approvals.
- Work with GEG (Global Environmental Group) and FLU (Front Line Unit) on model requirements gathering & analysis, Climate Risk target setting, disclosure, analysis & reporting.

Requirements:
- Education: B.E. / B.Tech / M.E. / M.Tech
- Certifications (if any): NA
- Experience Range: 9 to 13 years

Foundational Skills:
- Advanced knowledge of SQL and Python
- Advanced Excel, VSCode, LaTeX, Tableau skills
- Experience in multiple data environments such as Oracle, Hadoop, and Teradata
- Knowledge of data architecture concepts, data models, and ETL processes
- Knowledge of climate risk, financial concepts & products
- Experience in extracting and combining data from multiple sources, and aggregating data for model development
- Experience in conducting quantitative analysis, performing model-driven analytics, and developing models
- Experience in documenting business requirements for data, models, implementation, etc.

Desired Skills:
- Basics of Finance
- Basics of Climate Risk

Work Timings: 11:30 AM to 8:30 PM
Job Location: Hyderabad, Chennai
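For context on the financed-emissions modeling described above, one widely used approach (the attribution-factor method popularized by PCAF) scales each company's emissions by the lender's share of the company's value. A minimal illustrative sketch, with made-up numbers and field names that are not from the posting:

```python
def financed_emissions(positions):
    """Asset-level financed emissions via the attribution-factor approach:
    each holding contributes (outstanding amount / company value) times the
    company's total emissions. Company value might be EVIC for listed firms."""
    total = 0.0
    for p in positions:
        attribution = p["outstanding"] / p["company_value"]
        total += attribution * p["company_emissions_tco2e"]
    return total

portfolio = [
    # Illustrative numbers only (amounts in USD, emissions in tCO2e).
    {"outstanding": 10_000_000, "company_value": 100_000_000, "company_emissions_tco2e": 50_000},
    {"outstanding": 5_000_000,  "company_value": 50_000_000,  "company_emissions_tco2e": 20_000},
]
print(round(financed_emissions(portfolio), 6))  # about 7000 tCO2e: 0.1*50000 + 0.1*20000
```

In production such a calculation would run over asset-level balance sheet data sourced through the data pipelines the role describes, with the attribution formula varying by asset class.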

Posted 2 weeks ago

Apply