5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Technical Specialist, AVP at Deutsche Bank in Pune, India, your main role is to maintain applications and their production processes. Your responsibilities include providing support for application jobs and functionalities in Integration, UAT, and Production environments. Additionally, you will handle time-critical requests, root cause analysis for UAT and Production abends, ad hoc business request implementation, and assisting junior team members.

Your key responsibilities include:
- Providing extended support to L2 operations
- Maintaining the running status of Production processes
- Ensuring timely communications to stakeholders and managing expectations
- Handling time-critical requests such as data manipulation, job modifications, and abend resolution
- Supporting and enabling team members, resolving challenges, and ensuring no blockers for delivering L3 tasks
- Performing root cause analysis for abends and implementing solutions
- Providing Level 3 support for technical infrastructure components
- Handling tickets and guiding junior team members
- Taking responsibility for code modifications as part of business requests
- Integrating software components and verifying them through testing
- Providing release deployments on non-Production Management controlled environments
- Managing maintenance of applications and performing technical change requests
- Collaborating with colleagues in other stages of the Software Development Lifecycle
- Maintaining the quality of code and deliverables
- Creating and maintaining application operational documents and SDLC release documents
- Deploying solutions that fulfill new functionalities without impacting ongoing processes

In terms of qualifications, you are expected to have a Bachelor of Science degree with a concentration in Computer Science or Software Engineering, along with knowledge of technologies such as Mainframe code maintenance (COBOL, DB2, JCL), Mainframe L3 application operations, SAS, Unix, and relevant processes/tools like HP ALM, Jira, ServiceNow, etc. Strong analytical skills, stakeholder management, communication skills, and the ability to work in virtual teams are essential. You should also have relevant Financial Services experience and the ability to balance business demands with IT fulfillment.

Deutsche Bank offers a range of benefits including a best-in-class leave policy, parental leave, a childcare assistance benefit, sponsorship for certifications, and comprehensive insurance coverage. You will also receive training, coaching, and continuous learning opportunities to excel in your career. For further information about Deutsche Bank and its culture, please visit the company website at [https://www.db.com/company/company.htm](https://www.db.com/company/company.htm). Deutsche Bank promotes a positive, fair, and inclusive work environment where employees are empowered to excel together every day.
Posted 4 days ago
7.0 - 11.0 years
0 - 0 Lacs
Maharashtra
On-site
Role Overview: You will join as an experienced Data Architect, taking charge of designing and executing data lineage solutions that align with clients' business objectives. Your responsibilities will involve collaborating with various teams to ensure the accuracy, integrity, and timeliness of the data lineage solutions. You will engage directly with clients to help them maximize the value of the product and achieve their contractual objectives.

Key Responsibilities:
- Design and implement robust data lineage solutions that support business intelligence, analytics, and data governance initiatives.
- Collaborate with stakeholders to understand data lineage requirements and transform them into technical and business solutions.
- Develop and maintain lineage data models, semantic metadata systems, and data dictionaries.
- Ensure data quality, security, and compliance with relevant regulations.
- Understand and implement Solidatus' best practices in data lineage modelling at client sites.
- Stay updated on emerging technologies and industry trends to continuously enhance data lineage architecture practices.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Demonstrated experience in data architecture, focusing on large-scale data systems across multiple companies.
- Proficiency in data modelling, database design, and data warehousing concepts.
- Experience with cloud platforms such as AWS, Azure, and GCP, and big data technologies like Hadoop and Spark.
- Strong grasp of data governance, data quality, and data security principles.
- Excellent communication and interpersonal skills, with the ability to work effectively in a collaborative environment.

Why Join Solidatus?
- Contribute to an innovative company shaping the future of data management.
- Collaborate with a dynamic and talented team in a supportive work environment.
- Opportunities for professional growth and career advancement.
- Enjoy flexible working arrangements, including hybrid work options.
- Competitive compensation and benefits package.
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
Role Overview: You will join as a Manager in the Data Science team, responsible for establishing standards and practices to deliver effective models with clean, secure, scalable, and maintainable code. Your role will involve leading, building, and growing the Data Science engineering team to ensure the development and delivery of quality models.

Key Responsibilities:
- Establish best standards and practices for yourself and the team to develop effective models with clean, secure, scalable, and maintainable code.
- Lead, build, and grow the Data Science engineering team to enable them to build and deliver quality models.

Qualifications Required:
- Top-notch problem-solving skills to understand the complex variables behind various credit models.
- Proficiency in data modelling and predictive analytics.
- Ability to set up strategies and standards for Modelling, Machine Learning, and Advanced Analytics.
- Ability to provide technical leadership and mentorship to data scientists and analytics professionals.
- Previous work experience in credit underwriting, fraud prevention, and risk modelling is advantageous.
- Strong learning abilities and the ability to thrive in a dynamic, collaborative, and fast-paced environment.
- Excellent team player who can collaborate with cross-functional teams.
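The credit models referenced above are commonly scorecard-style: a weighted combination of applicant variables passed through a logistic function to yield a default probability. A minimal sketch (the feature names and weights here are invented for illustration, not an actual credit policy):

```python
import math

def default_probability(features, weights, bias):
    """Scorecard-style credit model: linear score -> logistic probability."""
    score = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical weights: more debt and missed payments raise risk,
# longer employment lowers it.
weights = {"debt_to_income": 3.0, "missed_payments": 0.8, "years_employed": -0.2}
applicant = {"debt_to_income": 0.4, "missed_payments": 2, "years_employed": 5}
p = default_probability(applicant, weights, bias=-2.0)
print(round(p, 3))
```

In practice such weights would be fitted on historical repayment data rather than chosen by hand; the sketch only shows the shape of the model.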
Posted 4 days ago
14.0 - 18.0 years
0 Lacs
Karnataka
On-site
As a Senior Manager - GenAI Architect at EY, you will be responsible for leading and designing advanced AI architectures that drive strategic business outcomes. Your role will involve a deep understanding of data management, application development, AI technologies, cloud solutions, and user interface design. You will collaborate closely with cross-functional teams to deliver robust and scalable solutions that align with our business objectives.

**Key Responsibilities:**
- **Architectural Design:** Develop and oversee data architectures and application frameworks that support AI initiatives. Ensure seamless integration of data and applications with existing systems to create cohesive and efficient solutions.
- **AI Solutions:** Design and implement AI-driven models and solutions by leveraging machine learning, natural language processing, and other AI technologies to solve complex business problems.
- **Cloud Integration:** Architect cloud-based solutions that support scalability, security, and performance for AI applications. Collaborate with cloud providers and internal teams to ensure optimal cloud infrastructure.
- **User Interface Design:** Work with UI/UX teams to ensure that AI solutions have user-friendly interfaces and deliver an exceptional user experience.
- **Leadership & Collaboration:** Lead a team of architects and engineers, providing guidance and support in the development of AI solutions. Collaborate with stakeholders across departments to align technical strategies with business goals.
- **Strategic Planning:** Develop and implement architectural strategies and roadmaps that align with the company's vision and technology advancements.
- **Go-To-Market Strategy (GTM):** Collaborate with onsite teams and senior architects to define and execute go-to-market strategies for AI solutions. Provide architectural insights that align with clients' needs and support successful solution development.
- **Innovation & Best Practices:** Stay updated on industry trends and emerging technologies to drive innovation and implement best practices in AI architecture and implementation.

**Qualifications:**
- **Education:** Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Mathematics, or a related field.
- **Experience:** Minimum of 14 years of experience in a senior architectural role focusing on AI, data management, application development, and cloud technologies.

**Technical Skills:**
- Hands-on experience deploying AI/ML solutions on cloud platforms such as Azure, AWS, and/or Google Cloud.
- Experience using and orchestrating LLM models on cloud platforms (e.g., OpenAI on Azure, AWS Bedrock, or GCP Vertex AI/Gemini).
- Proficiency in writing SQL and in data modeling.
- Experience designing and implementing AI solutions using a microservice-based architecture.
- Understanding of machine learning, deep learning, NLP, and GenAI.
- Strong programming skills in Python and/or PySpark.
- Proven experience integrating authentication and security measures within machine learning operations and applications.
- Excellent problem-solving skills with the ability to connect AI capabilities to business value.
- Strong communication and presentation skills.
- Experience with AI/ML solution deployment on Kubernetes, Web Apps, Databricks, or similar platforms.
- Familiarity with MLOps concepts and tech stack, including code versioning, MLflow, batch prediction, and real-time endpoint workflows.
- Familiarity with Azure DevOps, GitHub Actions, Jenkins, Terraform, AWS CFT, etc.

At EY, we are dedicated to building a better working world by creating new value for clients, people, society, and the planet while fostering trust in capital markets. Our teams, enabled by data, AI, and advanced technology, help shape the future with confidence and provide solutions for current and future challenges. Working across assurance, consulting, tax, strategy, and transactions services, EY teams leverage sector insights, a globally connected network, and diverse ecosystem partners to offer services in over 150 countries and territories.
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
Haryana
On-site
Role Overview: You will work on a variety of innovative and ambitious projects for clients of all sizes in diverse sectors such as agri-food, commerce, distribution, luxury, energy, industry, pharma/chemistry, and health. As a passionate Power BI Consultant, after an integration period to familiarize yourself with the EY Fabernovel methodology, you will be involved in all project phases, from analyzing client needs to delivering the technical solution, including estimation and solution development. You will work both in a team and autonomously in a technically innovative environment.

Key Responsibilities:
- Design, efficiently create, and maintain reports and dashboards following best practices and storytelling rules
- Participate in data visualization using Microsoft Power BI
- Estimate projects, and analyze needs and technical constraints
- Prepare data sources and perform data modeling
- Support clients, coach users, and facilitate user training
- Audit existing ecosystems (visual improvement of dashboards, performance optimization, cost, security, etc.)
- Stay updated on technological advancements
- Define a self-service BI strategy and implement it

Qualifications Required:
- Bachelor's degree in Computer Science with a minimum of 2 years of similar experience
- Required technical skills: Power BI report/dashboard visualization, advanced DAX calculations in Power BI, collaboration and sharing via Power BI Service, SQL, and data modeling (M language for Power BI)
- Appreciated technical skills: Power BI Service administration, knowledge of databases (e.g., Snowflake, Google BigQuery, SQL Server, or Oracle), data warehouse modeling and design, and use of integration tools (e.g., Azure Data Factory, SSIS, Talend, Alteryx)
- Functional skills: mastery of data concepts (modeling, value chain, the different project phases, data quality); autonomy and teamwork (including knowledge sharing); a proactive approach and the ability to search for and find solutions; creativity and curiosity; dynamism and responsiveness; a service-oriented mindset; the ability to contribute to community engagement (internal and external); excellent written and oral presentation; technical English proficiency
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
Tamil Nadu
On-site
Role Overview: As a Data Analyst at Standard Chartered, you will be responsible for managing all aspects of new work/project assessments from inception to delivery. Your main tasks will include building close working relationships with the Chief Data Office to design and implement specific data requirements based on the bank's data standards. Additionally, you will collaborate with project teams, partners, and third-party vendors.

Key Responsibilities:
- Work closely with business users to understand their data analysis needs and requirements. Build specifications that define the business value of data analysis, the approach for extracting data from Golden Sources, and the business rules/logic.
- Align with the Bank's Data Quality Management Framework to ensure data quality controls, governance, and compliance are in line with the Bank's Data Strategy and Architecture.
- Develop use cases for data analytics applications to meet various business needs. Build data models/scenarios to showcase potential insights to the business.
- Partner with system and data owners to document Data Standards for individual critical data attributes.
- Perform data quality controls assessments, identify gaps, and follow up with data owners for remediation.
- Conduct standard/advanced profiling of data attributes to assess data quality.
- Perform gap analysis against the established DQMF framework and guidelines to evaluate levels of adherence.
- Support the transition of data quality and data governance capabilities into Business as Usual (BAU).
- Develop a standard set of analytic tools to enable businesses to perform data analytics.
- Provide data readiness reviews before implementation and manage any arising issues related to data visualization.

This role requires strong data storytelling skills using data visualization tools such as MicroStrategy and Tableau.

Qualifications Required:
- Strong query language skills, including SQL, Hive, HBase, and ETL (Dataiku).
- Proficiency in Business Intelligence tools and Decision Support Systems.
- Solid data analysis skills using Hive, Spark, Python, R, MicroStrategy, and Tableau.
- Experience working with key stakeholders within the business.
- Proven problem-solving skills and experience in Data Management and Data Quality Management techniques.
- Stakeholder management and analysis abilities.
- Presentation skills for data storytelling using visualizations.
- Soft skills including communication, negotiation, relationship building, and influencing.

About Standard Chartered: Standard Chartered is an international bank known for its positive impact on clients, communities, and employees for over 170 years. The bank values diversity, challenges the status quo, and strives for continuous improvement. If you are seeking a purpose-driven career in a bank that makes a difference, Standard Chartered welcomes you to join their team. The bank's core purpose is to drive commerce and prosperity through unique diversity, advocating for inclusion and embracing differences across teams and geographies.

What We Offer:
- Core bank funding for retirement savings, medical and life insurance.
- Flexible working options and patterns.
- Proactive well-being support and continuous learning opportunities.
- An inclusive and values-driven organizational culture that celebrates diversity and respects individual potential.
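The standard profiling of data attributes described above can be sketched in a few lines: for each column, measure completeness, distinct values, and the dominant value. This is a minimal illustration only (the column data and null markers are hypothetical, not part of any bank framework):

```python
from collections import Counter

def profile_attribute(values):
    """Basic data-quality profile for one column: completeness,
    distinct count, and most frequent value."""
    total = len(values)
    non_null = [v for v in values if v not in (None, "", "N/A")]
    counts = Counter(non_null)
    most_common = counts.most_common(1)[0][0] if counts else None
    return {
        "total": total,
        "completeness": len(non_null) / total if total else 0.0,
        "distinct": len(counts),
        "most_frequent": most_common,
    }

# Example: profiling a hypothetical 'country_code' attribute
profile = profile_attribute(["IN", "SG", "IN", None, "HK", "IN", ""])
print(profile["distinct"])  # 3 distinct populated values
```

A gap analysis would then compare each profile against the thresholds the governance framework defines (e.g., minimum completeness per critical attribute).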
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As an Industry & Functional AI Decision Science Analyst at Accenture Strategy & Consulting, within the Global Network Data & AI team and specializing in the Insurance practice, you will play a crucial role in helping clients innovate and grow their businesses through advanced analytics solutions. With job locations in Gurugram and Bengaluru, you will be at the forefront of transforming the insurance industry by leveraging cutting-edge AI and ML tools and techniques.

Accenture's Global Network Insurance Data & AI Practice is dedicated to assisting clients across the insurance value chain, from Underwriting to Claims to Servicing, as well as Enterprise Functions. By developing analytic capabilities that span data access, reporting, predictive modeling, and Generative AI, we empower our clients to outperform their competition. Your responsibilities will include collaborating with cross-functional global teams to address complex business challenges and translate them into data-driven solutions that drive actionable insights and operational enhancements.

In this role, you will have the opportunity to architect, design, build, deploy, deliver, and monitor advanced analytics models tailored to various insurance issues. By consuming data from diverse sources and presenting information in a clear and understandable manner, you will provide valuable insights to technical and non-technical stakeholders. Additionally, you will have the chance to mentor junior team members, contributing to their professional growth and development.

Joining Accenture's Global Network means becoming part of a dynamic and diverse community that thrives on pushing the boundaries of business capabilities. With a focus on continuous learning and growth opportunities, you will be part of an organization deeply invested in your personal and professional development. As Accenture ranks 10th on the 2023 World's Best Workplaces list, you can expect a supportive and inclusive work environment that fosters collaboration and innovation.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Data Modeller, you will play a crucial role in designing, governing, and optimizing the core data models within our Data & Analytics platform. Your responsibilities will include leading the development of scalable, modular, and business-aligned models to support analytics and reporting across multiple regions and clients. Collaborating closely with data engineering, BI, and business stakeholders, you will ensure accurate embedding of business logic in the models, maintain semantic consistency, and support high-performance, secure, and compliant data structures. Your expertise will be instrumental in translating complex business requirements into robust technical models that facilitate efficient decision-making and insight generation.

Working in the B2B software and services sector, you will contribute to delivering mission-critical solutions for large-scale clients globally, in accordance with our commitment to innovation and excellence. Additionally, you will support initiatives led by the GSTF, demonstrating our company's values of environmental and social responsibility. You will contribute by identifying and proposing local sustainable practices aligned with our Sustainability Charter and participate in challenges to improve sustainable behaviors through our sustainability app.

Key Responsibilities:
- Design, implement, and maintain scalable and modular data models for Snowflake, incorporating region- and country-specific extensions without impacting the global core.
- Define, document, and approve changes to the core enterprise data model, embedding business logic into model structures.
- Lead data modelling workshops with stakeholders to gather requirements and ensure alignment between business, engineering, and BI teams.
- Collaborate with developers, provide technical guidance, and review outputs related to data modelling tasks.
- Optimize models for performance, data quality, and governance compliance.
- Work with BI teams to ensure semantic consistency and enable self-service analytics.
- Ensure adherence to data security, RBAC, and compliance best practices.
- Utilize DevOps tools like Git/Bitbucket for version control of data models and related artifacts.
- Maintain documentation, metadata, and data lineage for all models.
- Preferred: Utilize tools like Matillion or equivalent ETL/ELT tools for model integration workflows.
- Fulfill any additional duties as reasonably requested by your direct line leader.

Required Skills:
- Proven expertise in designing enterprise-level data models for cloud data platforms, preferably Snowflake.
- Strong understanding of data warehouse design patterns such as dimensional, Data Vault, and other modeling approaches.
- Ability to embed business logic into models and translate functional requirements into technical architecture.
- Experience managing and approving changes to the core data model, ensuring scalability, semantic consistency, and reusability.
- Proficiency in SQL, with experience in Snowflake-specific features.
- Familiarity with ELT/ETL tools such as Matillion, DBT, Talend, or Azure Data Factory.
- Experience with DevOps practices, including version control of modeling artifacts.
- Knowledge of metadata management, data lineage, and data cataloging tools.
- Strong understanding of data privacy, governance, and RBAC best practices.
- Excellent communication and stakeholder engagement skills.
- Positive attitude with a focus on delivering excellence.
- Strong attention to detail and exceptional relationship management skills.
- An open-minded, consultative approach and the ability to provide and receive constructive feedback.
- Creative problem-solving skills and the ability to work effectively in a team environment.
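As one illustration of the Data Vault pattern named above: hub records are conventionally keyed by a deterministic hash of the normalized business key, so the same entity loads to the same key regardless of which regional source supplied it. A minimal sketch (the `||` delimiter and SHA-256 are common conventions, not a mandated standard):

```python
import hashlib

def hub_hash_key(*business_key_parts: str) -> str:
    """Deterministic surrogate key for a Data Vault hub: normalize the
    business key parts, join with a delimiter, and hash."""
    normalized = "||".join(p.strip().upper() for p in business_key_parts)
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same business key always yields the same hub key, regardless of
# casing or stray whitespace in the source system.
k1 = hub_hash_key("ACME Corp", " gb ")
k2 = hub_hash_key("acme corp", "GB")
print(k1 == k2)  # True
```

In a Snowflake model the equivalent logic would typically live in the ELT layer (e.g., a Matillion or DBT transformation) rather than in application code.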
Posted 5 days ago
0.0 - 3.0 years
0 Lacs
Karnataka
On-site
As a software developer for our team in Bengaluru, you will be responsible for developing and optimizing dynamic web applications using the Django framework. Your expertise in following best coding practices and basic architecture design will contribute to building scalable web applications. You will deploy code on production servers and ensure security and data protection measures are implemented.

Your role will involve integrating data storage solutions and developing RESTful APIs for the backend system. Proficiency in database modeling, query optimization, and version control using Git is necessary. Understanding fundamental design principles, data modeling, and entity modeling will be essential for creating scalable applications.

Your responsibilities will include debugging code, adding new enhancements and features to existing applications, and optimizing applications for maximum speed, scalability, and availability. Collaboration with team members to ensure performance, responsiveness, and availability to users is vital.

To excel in this role, you should have a good understanding of data structures and algorithms, familiarity with Linux operating systems, and fluency with the command line interface. Experience with optimizing web applications for improved performance and knowledge of other web-related technologies would be advantageous.

This full-time position requires less than 2 years of work experience. If you are passionate about developing high-quality web applications and possess the required skills, we encourage you to apply and be part of our dynamic team.
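The entity modeling mentioned above can be sketched framework-free before it is mapped to Django ORM models: make each entity a typed record and keep relationships explicit so they translate directly into foreign keys. The entity and field names below are hypothetical, purely for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Author:
    id: int
    name: str

@dataclass
class Article:
    id: int
    title: str
    author: Author  # one-to-many: each article references one author
    tags: list = field(default_factory=list)
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Explicit relationships map cleanly onto foreign keys and join tables
# at the database layer.
alice = Author(id=1, name="Alice")
post = Article(id=10, title="Hello", author=alice, tags=["intro"])
print(post.author.name)  # Alice
```

In Django the same shape would become two `models.Model` subclasses with a `ForeignKey` from `Article` to `Author`.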
Posted 5 days ago
12.0 - 18.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Services & Integration Manager, you will be responsible for overseeing the technical delivery of data and the relevant infrastructure, with a strong understanding of systems such as SAP, iBPM, Oracle, and Informatica. Your expertise in technology, frameworks, and accelerators like ERWIN, Sparks, Zachman, and industry data models will be essential. You should possess knowledge of Azure or AWS, and experience in catalogue & metadata management, data ownership, stewardship, and governance.

Your main responsibilities will include developing a corporate data model to ensure data is treated as a reusable asset, driving consistency in data model, ownership, definition, and structure, and ensuring data connectivity across all layers. You will actively engage with data change projects to create a corporate view of data and collaborate with senior stakeholders to understand business requirements and drive execution.

To qualify for this role, you should have 12 to 18 years of experience and a Bachelor's Degree in Accounting, Finance, Business, or a relevant data modelling certification. Additionally, possessing an architecture certification such as TOGAF would be beneficial. If you are passionate about data integration and have a strong background in data services, this role in Pune offers you the opportunity to make a significant impact.

For more information on this exciting opportunity, please contact 85916 09735 or email priyanshu@mmcindia.biz.
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a Data Engineer at Birlasoft, a global leader in Cloud, AI, and Digital technologies, you will play a crucial role in designing and developing data transformations and data models. Your primary responsibility will be to ensure reliable and efficient data processing and analysis to support data-driven decision-making processes. Working closely with cross-functional teams, you will contribute to the overall success of our insights teams.

Your key proficiencies should include expertise in DBT (Data Build Tool) for data transformation and modelling. You must demonstrate proficiency in Snowflake, including experience with Snowflake SQL and data warehousing concepts. A strong understanding of data architecture, data modelling, and data warehousing best practices is essential for this role.

In this position, you will design, develop, and maintain robust data pipelines using DBT and Snowflake. You will be responsible for implementing and optimizing data ingestion processes to ensure efficient and accurate data flow from various sources. Collaboration with data scientists, analysts, and stakeholders is crucial to understand data requirements and ensure data integrity and quality.

As a Data Engineer, you should have proven experience in data ingestion and ETL processes. Experience with other ETL tools and technologies like Apache Airflow, Talend, or Informatica is a plus. Proficiency in SQL and experience with programming languages such as Python or Java are required. Familiarity with cloud platforms and services, especially AWS, and experience with AWS Lambda is a must.

You are expected to adhere to and promote development best practices, including version control using Git and branching models. Code review to ensure consistent coding standards and practices is part of your responsibilities. Participation in scrum methodology, including daily stand-ups, sprint planning, and retrospectives, is essential. Effective communication with team members and stakeholders to understand requirements and provide updates is also key.

Ownership of assigned tasks and the ability to work independently to complete them will contribute to your success in this role. Staying up to date with the latest trends and technologies in data engineering, DBT, and Snowflake is important to ensure continuous improvement and innovation in your work.
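A DBT model is ultimately a SQL transformation run inside the warehouse; the staging-layer cleanup such a model typically performs (rename, cast, quality-gate) can be sketched in plain Python to show the shape of the logic. The table and column names below are hypothetical:

```python
def stage_orders(raw_rows):
    """Staging-style transformation: rename columns, cast types, and
    drop records that fail a basic quality check - the kind of logic a
    DBT staging model would express in SQL."""
    staged = []
    for row in raw_rows:
        if not row.get("ORDER_ID"):
            continue  # quality gate: skip rows without a primary key
        staged.append({
            "order_id": int(row["ORDER_ID"]),
            "amount_usd": round(float(row["AMT"]), 2),
            "country": row.get("CTRY", "UNKNOWN").upper(),
        })
    return staged

rows = [
    {"ORDER_ID": "1", "AMT": "19.999", "CTRY": "in"},
    {"ORDER_ID": None, "AMT": "5.00"},  # dropped by the quality gate
]
print(stage_orders(rows))
# [{'order_id': 1, 'amount_usd': 20.0, 'country': 'IN'}]
```

In DBT itself this would be a `SELECT` with casts and a `WHERE order_id IS NOT NULL`, materialized as a staging view or table in Snowflake.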
Posted 5 days ago
4.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
At Solidatus, we are revolutionizing the way organizations comprehend their data. We are an award-winning, venture-backed software company often referred to as the Git for Metadata. Our platform enables businesses to extract, model, and visualize intricate data lineage flows. Through our unique lineage-first approach and active AI development, we offer organizations unparalleled clarity and robust control over their data's journey and significance. As a rapidly growing B2B SaaS business with fewer than 100 employees, your contributions play a pivotal role in shaping our product. Renowned for our innovation and collaborative culture, we invite you to join us as we expand globally and redefine the future of data understanding. We are currently looking for an experienced Data Pipeline Engineer/Data Lineage Engineer to support the development of data lineage solutions for our clients" existing data pipelines. In this role, you will collaborate with cross-functional teams to ensure the integrity, accuracy, and timeliness of the data lineage solution. Your responsibilities will involve working directly with clients to maximize the value derived from our product and assist them in achieving their contractual objectives. **Experience:** - 4-10 years of relevant experience **Qualifications:** - Proven track record as a Data Engineer or in a similar capacity, with hands-on experience in constructing and optimizing data pipelines and infrastructure. - Demonstrated experience working with Big Data and related tools. - Strong problem-solving and analytical skills to diagnose and resolve complex data-related issues. - Profound understanding of data engineering principles and practices. - Exceptional communication and collaboration abilities to work effectively in cross-functional teams and convey technical concepts to non-technical stakeholders. - Adaptability to new technologies, tools, and methodologies within a dynamic environment. 
- Proficiency in writing clean, scalable, and robust code using Python or similar programming languages. Background in software engineering is advantageous. **Desirable Languages/Tools:** - Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. - Experience with XML in transformation pipelines. - Familiarity with major Database technologies like Oracle, Snowflake, and MS SQL Server. - Strong grasp of data modeling concepts including relational and dimensional modeling. - Exposure to big data technologies and frameworks such as Databricks, Spark, Kafka, and MS Notebooks. - Knowledge of modern data architectures like lakehouse. - Experience with CI/CD pipelines and version control systems such as Git. - Understanding of ETL tools like Apache Airflow, Informatica, or SSIS. - Familiarity with data governance and best practices in data management. - Proficiency in cloud platforms and services like AWS, Azure, or GCP for deploying and managing data solutions. - Strong problem-solving and analytical skills for resolving complex data-related issues. - Proficiency in SQL for database management and querying. - Exposure to tools like Open Lineage, Apache Spark Streaming, Kafka, or similar for real-time data streaming. - Experience utilizing data tools in at least one cloud service - AWS, Azure, or GCP. **Key Responsibilities:** - Implement robust data lineage solutions utilizing Solidatus products to support business intelligence, analytics, and data governance initiatives. - Collaborate with stakeholders to comprehend data lineage requirements and translate them into technical and business solutions. - Develop and maintain lineage data models, semantic metadata systems, and data dictionaries. - Ensure data quality, security, and compliance with relevant regulations. - Uphold Solidatus implementation and data lineage modeling best practices at client sites. 
- Stay updated on emerging technologies and industry trends to continually enhance data lineage architecture practices.

**Qualifications:**
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience in data architecture, focusing on large-scale data systems across multiple companies.
- Proficiency in data modeling, database design, and data warehousing concepts.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark).
- Strong understanding of data governance, data quality, and data security principles.
- Excellent communication and interpersonal skills to thrive in a collaborative environment.

**Why Join Solidatus?**
- Participate in an innovative company that is shaping the future of data management.
- Collaborate with a dynamic and talented team in a supportive work environment.
- Opportunities for professional growth and career advancement.
- Flexible working arrangements, including hybrid work options.
- Competitive compensation and benefits package.

If you are passionate about data architecture and eager to make a significant impact, we invite you to apply now and become a part of our team at Solidatus.
Posted 5 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We are seeking an experienced Business Intelligence Developer with 8+ years of experience and expertise in Oracle Analytics Cloud (OAC), Power BI, ETL tools, and data modeling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth at EY.

Responsibilities:
- Collaborate with stakeholders to understand data requirements and translate business needs into data models.
- Design and implement effective data models to support business intelligence activities.
- Develop and maintain ETL processes to ensure data accuracy and availability.
- Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and Power BI.
- Work with stakeholders to gather requirements and translate business needs into technical specifications.
- Optimize data retrieval and develop dashboard visualizations for performance efficiency.
- Ensure data integrity and compliance with data governance and security policies.
- Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
- Conduct data analysis to identify trends, patterns, and insights that can inform business strategies.
- Provide training and support to end-users on BI tools and dashboards.
- Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing.
- Stay up to date with the latest BI technologies and best practices to drive continuous improvement.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Business Analytics, or a related field.
- Proven experience with Oracle Analytics Cloud (OAC), Power BI, and other BI tools.
- Strong experience with ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions.
- Proficiency in data modeling techniques and best practices.
- Solid understanding of SQL and experience with relational databases.
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud).
- Excellent analytical, problem-solving, and project management skills.
- Ability to communicate complex data concepts to non-technical stakeholders.
- Detail-oriented with a strong focus on accuracy and quality.
- Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes, and operating constraints.
- Strong consulting skills with proven experience in client and stakeholder management and collaboration.
- Good written and oral communication skills, the ability to make impactful presentations, and expertise in using Excel and PowerPoint.
- Good to have: knowledge of data security and controls to address customers' data privacy needs in line with regional regulations such as GDPR, CCPA, etc.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate.
Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 5 days ago
5.0 - 9.0 years
5 - 9 Lacs
hyderabad, pune
Work from Office
Strong financial domain and data analysis skills, with experience covering activities like requirement gathering, elicitation, gap analysis, data analysis, effort estimation, and reviews.

Required Candidate Profile:
Ability to translate high-level functional data or business requirements into technical solutions, database design, and data mapping. Strong data analysis and/or data modelling with Erwin.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be joining LSEG (London Stock Exchange Group), a diversified global financial markets infrastructure and data business that is committed to delivering exceptional services to its customers worldwide. With a legacy of over 300 years, LSEG has been instrumental in supporting the financial stability and growth of communities and economies globally.

As a Senior Business Analyst in the Infrastructure Platform Engineering (IPE) team, your role will be pivotal in driving strategic workforce planning and management as part of the broader IPE transformation program. Your primary responsibility will involve designing workforce data models that integrate forecasts, actuals, cost models, and strategic baselines to facilitate informed decision-making within IPE. By delivering insights on workforce trends, skills distribution, and hiring strategies, you will contribute to long-term planning and talent development initiatives. Additionally, you will lead the workforce audit process, ensuring governance traceability for changes across workforce plans, and collaborate closely with transformation leads, product owners, and engineering teams to ensure data-backed decision-making aligned with strategic priorities.

Your role will also encompass building and maintaining dashboards and reporting tools for workforce and transformation program governance, and engaging with stakeholders across departments to collect and translate customer requirements into actionable backlog items. Operating within agile squads or cross-functional delivery teams, you will play a key role in analyzing operational processes, identifying areas for improvement, and designing future-state workflows to support the new operating model.

To excel in this position, you will need a Bachelor's degree in Engineering, Computer Science, or equivalent experience, along with 5+ years of technical business analysis experience in transformation or large-scale program delivery.
Proficiency in data modeling, visualization tools such as Excel and Power BI, and essential tooling knowledge of Microsoft tools and planning software will be crucial for success in this role. Strong attention to detail, analytical skills, and a proactive attitude to challenge assumptions and contribute to innovative solutions will be highly valued.

By joining LSEG, you will play a significant role in a strategic transformation program, gaining exposure to modern delivery models, collaboration with cross-functional teams, and a culture that values data-driven thinking and continuous improvement. You can look forward to a competitive compensation package, comprehensive benefits, continuous learning opportunities, and a dynamic work environment that encourages innovation and diversity. If you are ready to make a meaningful impact and advance your career in a global financial markets infrastructure organization, we invite you to join us at LSEG.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a Senior Audience Specialist, you will collaborate with marketing teams to understand campaign objectives and translate them into audience segments. You will work closely with data analysts to analyze data quality and accuracy, ensuring its usability for segmentation purposes. Effective communication with both technical and non-technical stakeholders is essential. You will adhere to industry standards for data handling, including PII data storage, transfer, and disposal procedures.

Utilizing AEP for segment creation, PII data-handling compliance, and data governance will be a key responsibility. Proficiency in Adobe Campaign Classic for audience segmentation, audience enrichment, data handling, sideload processes, and creating complex multi-channel workflows is required. Collaboration with other teams to manage the full Marketing-to-Sales funnel for Prospect/Growth audiences is expected if you have the relevant experience. Importing audiences or connecting to external systems through APIs, using tools such as Postman, is also a part of the role for experienced professionals.

Your role will involve utilizing your proficiency in marketing principles and campaign strategies, as well as working closely with data analysts. Familiarity with PII data storage, transfer, and disposal procedures, working knowledge of AEP and Adobe Campaign Classic, and the ability to create complex multi-channel workflows will be essential for success in this position.

In this role, you will need experience with lead nurture workstreams, data modelling, and the creation of audience cohorts to optimize reach. Experience with importing audiences or connecting to external systems through APIs (using tools such as Postman), working knowledge of JavaScript, and experience with Adobe Journey Optimizer are also important. This includes using journeys versus campaigns, journey orchestration, defining triggers and actions, and working with custom actions.
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
At MediaKind, we are reshaping the future of streaming video by leveraging our cloud-native, agile technology to enable customers to rapidly build, deploy, and scale premium streaming experiences. Our goal is to transform how the world watches video, ensuring that every moment is extraordinary, personalized, and valuable for content owners and consumers alike. As part of our dynamic team, you will be contributing to redefining how media experiences are brought to life.

Our award-winning products simplify complex operations, allowing customers - from iconic sports brands to innovative broadcasters and content owners - to focus on storytelling and growth. By replacing legacy complexity with cloud-connected solutions, we prioritize speed, simplicity, and commercial transparency. We value practical thinking over buzzwords and velocity over bureaucracy. Our strength lies not only in technology but also in our people. We are dedicated to fostering a community of creators, developers, and artists who are truly passionate about what they do. Together, we are redefining the art of streaming.

As a Data Analyst joining our master data and reporting team, you will play a crucial role in providing MediaKind with a solid foundation to enhance our reporting and data analytics tools. Your focus will be on process optimization, system standardization, data integrity, and creative analytics. Working with various stakeholders, you will drive sustainable improvements in data quality, ensuring that data operates effectively and efficiently.

We are seeking a delivery-focused and self-motivated Data Analyst to collaborate with our team, contribute to developing Tableau projects, and eliminate poor data quality. Your responsibilities will include developing, maintaining, and managing advanced reporting, analytics, dashboards, and other BI solutions, as well as performing data analysis, validation, and mapping.
You will collaborate with IT teams and business experts to implement data-driven policies and initiatives, create reports using graphical and data modeling techniques, and gather and normalize data from multiple sources.

At MediaKind, we are committed to creating an inclusive workplace that values the unique skills, capabilities, and perspectives of our employees. We believe that diverse teams are beneficial for our employees, customers, and business. If you are excited about this position and believe you can add value to MediaKind, we encourage you to apply, even if you feel your skills are not a perfect fit. We aim to increase diversity across the Media Broadcast industry and strive to make the recruitment process as accessible as possible. If you require any reasonable adjustments during the recruitment process, please reach out to the Recruiter to discuss how we can support you.
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
Arthur J. Gallagher & Co. is a global leader in insurance, risk management, and consulting, with operations supported by the Gallagher Center of Excellence (GCoE) in India. The GCoE in Bengaluru, Mumbai, Pune, Kolhapur, and Shimoga serves worldwide brokerage, claims, underwriting, finance, analytics, automation, and shared services. This blend of global expertise and local talent enables the delivery of innovative, client-focused solutions.

We are looking for an experienced MDM Data Analyst / Data Steward Lead to oversee and enhance master data quality throughout the organization. The role entails collaborating with stakeholders to define data requirements, establish governance policies, and ensure data accuracy by utilizing cutting-edge MDM tools. This position is ideal for individuals who are enthusiastic about data governance, quality, and integration.

Responsibilities:
- Develop expertise in the assigned data domain
- Comprehend all source systems that contribute to the MDM
- Document stewardship for the domain
- Create rules and standards for the data domain
- Produce metrics indicating data quality enhancements to showcase to the business

Requirements:
- Proficiency in MDM tools and technologies like Informatica MDM, CluedIn, or similar platforms
- Ability to translate complex business requirements into practical MDM solutions
- Deep understanding of data modeling, governance, quality, and integration principles
- Strong stakeholder management capabilities
- Experience required: 8-12 years
- Location: Bangalore

Work Shift:
- UK shift (3 PM - 12 AM)
- 1 week work from the office, 3 weeks WFH

Join us at Arthur J. Gallagher & Co. to be a part of a dynamic team that values excellence in data management and is committed to delivering exceptional solutions to our clients.
Posted 6 days ago
3.0 - 8.0 years
0 Lacs
maharashtra
On-site
Join us as a Senior Business Analyst & Data Modeler (BA-DM), where you will play a critical role in bridging the gap between business requirements and technical implementation, ensuring that regulatory data solutions meet Regulatory Reporting expectations and that data analytics solutions deliver unparalleled customer insights, in full compliance with Barclays' data security, accuracy, and efficiency standards.

You may be assessed on the key critical skills relevant for success in the role, such as strong experience in business analysis (involving the Investment Bank) and data modelling in delivering regulatory data projects. As an ideal candidate, you will have a strong background in business analysis, data modelling, and regulatory reporting frameworks within the investment banking domain.

To be successful as a BA-DM for the Investment Bank, you should have a strong understanding of investment banking products as well as the trade life cycle. Additionally, a strong understanding of regulatory frameworks and data governance frameworks applicable to investment banks is essential. You should also possess experience working on large-scale regulatory data projects in the investment bank, including requirements gathering, data analysis, and modelling.

Furthermore, the ideal candidate will have 8+ years of experience as a business analyst, detailing functional and non-functional requirements, along with 3+ years of experience in developing and optimizing data models with industry-standard tools and implementing data quality rules. A deep understanding of relational databases, data warehouse design, and ETL or ELT processes is crucial. Proficiency in SQL, Excel, and Python for data analysis and validation is required. The role also involves defining and executing test cases to validate data outputs and assisting in User Acceptance Testing (UAT).
Desirable skillsets/good to have include strong communication and presentation skills, excellent problem-solving skills with attention to detail, the ability to work under pressure and manage multiple priorities in a fast-paced environment, knowledge of data visualization tools such as Tableau and Power BI, familiarity with cloud-based data platforms like AWS, Azure, or GCP, and familiarity with big data technologies like Apache Spark, Starburst, Iceberg, DBT, object stores, Databricks, and Airflow. Professional certifications such as CBAP, TOGAF, or DAMA-CDMP are a plus. Experience with Agile project management methodologies is preferred.

This role will be based out of the Nirlon Knowledge Park/Altimus office, Mumbai.

Purpose of the Role:
To design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements; to design and implement solutions that meet those needs; and to create solutions that balance technology risks against business delivery, driving consistency.

Accountabilities:
- Design and develop solutions as products that can evolve to meet business requirements, aligning with modern software engineering practices.
- Carry out targeted design activities that apply an appropriate workload placement strategy and maximise the benefit of cloud capabilities.
- Develop best-practice designs incorporating security principles and meeting the Bank's resiliency expectations.
- Balance risks and controls to deliver agreed business and technology value.
- Adopt standardised solutions where applicable and contribute to their ongoing evolution.
- Provide fault-finding and performance-issue support to operational support teams.
- Perform solution design impact assessment in terms of risk, capacity, and cost impact, including estimation of project change and ongoing run costs.
- Develop architecture inputs required to comply with the bank's governance processes.
Vice President Expectations:
- Contribute to or set strategy, drive requirements, and make recommendations for change.
- Manage resources, budgets, and policies; deliver continuous improvements and escalate breaches of policies/procedures.
- For individuals with leadership responsibilities, demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to an excellent standard.
- Advise key stakeholders on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls related to the team's work.
- Collaborate with other areas of work to keep up with business activity and strategies.
- Create solutions based on sophisticated analytical thought and in-depth analysis, with interpretative thinking to define problems and develop innovative solutions.
- Seek out, build, and maintain trusting relationships with internal and external stakeholders to achieve key business objectives.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive.
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior BI Developer, you will be responsible for designing and delivering enterprise-level business intelligence solutions. Your main focus will be on building scalable BI models, reports, and dashboards that provide accurate, timely, and actionable insights for various regions and business units. Collaboration with stakeholders, data engineering teams, and data modelling teams is a key aspect of this role, to ensure that BI solutions are aligned with business requirements, maintain consistency, and comply with governance standards. In addition, you will play a crucial role in mentoring junior BI developers and contributing to best practices across BI initiatives.

With a minimum of 5 years of BI development experience, including at least 3 years in a senior/lead role, you should possess strong expertise in Power BI (mandatory) along with additional skills in tools like Tableau. Proficiency in advanced SQL and experience with cloud data platforms, preferably Snowflake, is essential. A solid understanding of data modelling principles such as dimensional, Data Vault, and semantic layers is required.

Your responsibilities will include leading enterprise-scale BI projects that align with business objectives; designing, developing, and maintaining BI datasets, reports, and dashboards (Power BI required; Tableau preferred); managing BI data models for scalability and compliance; optimizing reports and dashboards for large-scale environments; integrating BI solutions with ETL/ELT pipelines; and applying DevOps practices for version control and deployments. As part of the role, you will engage with stakeholders, facilitate workshops, mentor junior developers, and ensure the quality and standards of BI development are met. A Bachelor's degree in Computer Science, Data Management, Analytics, or a related field is expected.
In return, you will have the opportunity for growth in leadership, technical, and commercial skills, career progression in a global, high-growth environment, and a collaborative, innovative, and vibrant work culture. A competitive compensation and benefits package is also on offer.
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
coimbatore, tamil nadu
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We are looking for Manager-level Architects with expertise in Generative AI who can work on end-to-end pipelines: enabling data curation, building and fine-tuning Generative AI models, and deploying them into scalable production streams. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of a new service offering.

This role demands a highly technical, extremely hands-on individual who will work closely with our EY Partners and external clients to develop new business as well as drive other initiatives on different business needs across different Customer and SC&O domains. The ideal candidate must have a good understanding of the problems and use cases that can be solved with Gen AI models with the closest accuracy, combined with Supply Chain industry knowledge and proven experience in delivering solutions to different lines of business and technical leadership.

Your key responsibilities include collaborating with EY Supply Chain & Operations stakeholders and data engineering teams to enable high-quality data curation for model training and evaluation; designing, developing, and fine-tuning Generative AI models; implementing solutions for data pre-processing, tokenization, and dataset augmentation; deploying Generative AI models on cloud platforms or edge devices; working on MLOps pipelines; conducting performance benchmarking, hyperparameter tuning, and optimization; and staying updated on the latest trends and advancements in Generative AI.
To qualify for the role, you must have 6-10 years of experience in ML, MLOps, and Generative AI LLMs as a Developer/Lead/Architect; expertise in data engineering, data transformation, curation, feature selection, ETL mappings, and data warehouse concepts; thorough knowledge of SQL, Python, PySpark, Spark, and other languages; experience in developing end-to-end GenAI solutions with the capability to migrate the solution to production; knowledge of clouds like Azure, AWS, and GCP; knowledge of frameworks like LangChain, Hugging Face, and Azure ML Studio; knowledge of data modeling and vector DB management; and experience in designing and developing complex flows and processes.

Skills and attributes for success include proficiency in Python with a focus on AI/ML libraries and frameworks like LangChain, TensorFlow, PyTorch, or Hugging Face Transformers; experience with data pipelines and tools like Spark, Snowflake, or Databricks; strong hands-on experience deploying AI models on cloud platforms; expertise in model development with in-depth knowledge of LLMs; and the ability to mentor developers and contribute to architectural decisions.

Ideally, you'll also have a strong understanding of Customer and Supply Chain processes and operations; knowledge of programming concepts, cloud concepts, LLM models, design, and coding; expertise in data handling to resolve any data issues as per client needs; experience in designing and developing DB objects and vector DBs; experience creating complex SQL queries, PySpark code, and Python scripting for retrieving, manipulating, checking, and migrating complex datasets; and good verbal and written communication in English.

At EY, you'll be part of a market-leading, multi-disciplinary team of professionals, and you'll have opportunities to work with leading businesses across a range of industries.
EY offers support, coaching, and feedback, opportunities to develop new skills and progress your career, and the freedom and flexibility to handle your role in a way that's right for you.
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
coimbatore, tamil nadu
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As an AI Automation Architect for Customer Business Process Manager, you will be responsible for working on end-to-end pipelines, enabling data curation, building and fine-tuning Generative AI models, and deploying them into scalable production streams. This role offers a fantastic opportunity to be part of a leading firm and instrumental in the growth of a new service offering within the Supply Chain Technology group of our GDS consulting team.

You will collaborate with EY Supply Chain & Operations stakeholders and data engineering teams to enable high-quality data curation for model training and evaluation. You will design, develop, and fine-tune Generative AI models using frameworks like LangChain, Hugging Face, TensorFlow, or PyTorch; implement solutions for data pre-processing, tokenization, and dataset augmentation; deploy Generative AI models on cloud platforms (e.g., AWS, GCP, Azure) or edge devices, ensuring scalability and robustness; work on MLOps pipelines, including CI/CD workflows, model monitoring, and retraining strategies; conduct performance benchmarking, hyperparameter tuning, and optimization to improve model efficiency; and stay updated on the latest trends and advancements in Generative AI, integrating best practices into project workflows.

To qualify for this role, you must have 6-10 years of experience in ML, MLOps, and Generative AI LLMs as a Developer/Lead/Architect; expertise in data engineering, data transformation, curation, feature selection, ETL mappings, and data warehouse concepts; and thorough knowledge of SQL, Python, PySpark, Spark, and other languages.
Experience in developing end-to-end GenAI solutions with the capability to migrate the solution to production, knowledge of cloud platforms like Azure, AWS, and GCP, knowledge of frameworks like LangChain, Hugging Face, and Azure ML Studio, knowledge of data modeling and vector DB management and modeling, and the ability to design and develop complex flows and processes are also required.

Ideally, you'll also have a good understanding of Customer and Supply Chain processes and operations; strong knowledge of programming concepts, cloud concepts, LLM models, design, and coding; experience in data handling to resolve any data issues as per client needs; experience in designing and developing DB objects and vector DBs; experience creating complex SQL queries, PySpark code, and Python scripting for retrieving, manipulating, checking, and migrating complex datasets; experience in model selection and tuning; good verbal and written communication in English; and strong interpersonal, analytical, and problem-solving abilities. Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents is expected. Candidates with additional knowledge of BI tools such as Power BI, Tableau, etc., and experience with cloud databases and multiple ETL tools will be preferred.

At EY, you'll have the opportunity to drive Generative AI and ML-related developments, with additional knowledge of data structures, preferably in the Supply Chain industry. You'll be part of a market-leading, multi-disciplinary team of 10,000+ professionals, working with leading businesses across a range of industries globally. EY offers support, coaching, and feedback from engaging colleagues, opportunities to develop new skills and progress your career, and the freedom and flexibility to handle your role in a way that's right for you.
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets.
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a member of the DBS Transformation Group, you will be integral to nurturing the culture of the World's Best Bank, as recognized by Euromoney in 2018, 2019, and 2020. Our team's approach combines science and art, immersing stakeholders in design thinking and experimentation. We drive rigorous creativity along our innovation pipeline and foster connections between corporate entrepreneurs and start-ups. Our cross-disciplinary team is dedicated to inventing solutions that significantly enhance people's lives, work, and leisure activities. We are deeply passionate and committed to making banking a joyful experience, all while having a great deal of fun!

Your responsibilities will include assisting in the development and finalization of presentations for senior management, the DBS Group Board, regulators, and other stakeholders. This will involve showcasing India's performance against budgets and peer banks, among other content. You will prepare presentation content with a strategic mindset, analyzing financial results and key trends to provide a big-picture perspective. Additionally, you will be responsible for monthly/quarterly/semi-annual reporting in specified formats, managing the DBS India scorecard, business reviews, group-wide reviews, town halls, strategic projects, and reports required by the CEO's office. Your role will also involve supporting the execution of DBS India's strategy within business/support units and aiding the CEO's office with external and regional reporting and reviews.
You will play a crucial part in delivering on DBS India's Strategy and Planning plan and deliverables: conducting financial analysis of companies' balance sheets and profit and loss statements from a strategic investment and partnership viewpoint, developing financial models for assessing companies/prospects before strategic investment, creating tools for tracking investments/valuation and monitoring governance, and building forecasting tools for evaluating business performance and focus areas. To excel in this role, you must possess strong communication skills, a good understanding of technology tools (especially those related to analytics, data, and modeling), attention to detail, a hunger for learning, and a strong sense of ownership.
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You are a highly skilled Sr. Developer with 6 to 10 years of experience in Ignition Full Stack and Data Modelling. Your strong understanding of the IT domain, preferably with experience in Industrial Manufacturing, will enable you to contribute significantly to projects, enhancing technological capabilities and driving innovation. This hybrid role requires working a day shift with no travel obligations. Your responsibilities will include developing and maintaining high-quality software solutions using Ignition Full Stack; collaborating with cross-functional teams to design and implement data models; analyzing complex data sets to identify trends; ensuring software development activities adhere to industry best practices; troubleshooting technical issues; providing guidance to junior developers; participating in code reviews; staying updated with industry trends; contributing to project documentation; gathering and refining requirements with stakeholders; implementing security measures; optimizing software performance; and engaging in continuous improvement initiatives. To qualify for this role, you must have a strong background in Ignition Full Stack development, expertise in data modeling techniques, a solid understanding of the IT domain, proficiency in collaborating with cross-functional teams, excellent problem-solving skills, and a keen interest in staying updated with industry trends and technologies. Certification required: Certified Ignition Developer.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior Technical Consultant in this role, you will be responsible for utilizing your 5+ years of relevant experience, particularly with ServiceNow ITOM Service Mapping skills. Your primary focus will involve mapping data from various systems into ServiceNow, ensuring accurate and efficient transfer of information, and facilitating integrations with other tools and services. Your key responsibilities will include designing, developing, and deploying service maps, working with discovery patterns, and guaranteeing the accurate representation of IT services and their underlying infrastructure. A strong understanding of the CMDB, its structure, and its relationship to ITOM will be essential. You will also be developing and implementing data mapping solutions to enable the seamless movement and integration of data between ServiceNow and external systems. In this role, you will work closely with development and integration teams to ensure a smooth data flow between ServiceNow and other applications. Your tasks will involve configuring and customizing ServiceNow modules and workflows as per specific business requirements, supporting configuration changes related to mapping and data transformation needs, and conducting unit and integration testing to validate the accuracy of data flows. Additionally, you will be responsible for maintaining comprehensive documentation of data mappings, workflows, and integrations, providing ongoing support and troubleshooting for data mapping and integration issues, and collaborating with business analysts, developers, and stakeholders to comprehend data requirements and business processes effectively. Your qualifications should include a Bachelor's degree along with at least 7 years of experience in ServiceNow ITOM Discovery & Service Mapping and integrating data. Hands-on experience with ServiceNow ITOM (CMDB, Service Mapping) and proficiency in ServiceNow scripting (JavaScript) for mapping and transforming data are crucial.
You should also possess knowledge of data modeling and mapping techniques in ServiceNow, familiarity with SQL and querying relational databases, and experience with integration protocols such as REST, SOAP, FTP, and SFTP. Moreover, you are expected to hold relevant certifications, including ServiceNow Certified ITOM Discovery & Service Mapping, ServiceNow Certified Application Developer, and ITIL v3. Excellent analytical and problem-solving skills, strong verbal and written communication skills, effective presentation development and customer presentation skills, successful teamwork experience, and demonstrated leadership abilities will be valuable assets in this role.
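The data-mapping and transformation work described in this posting can be sketched in outline. The example below is plain Python with hypothetical source fields and ServiceNow-style target fields; the field names on both sides are illustrative, not an actual integration contract, and in practice the output would be pushed to a CMDB table via ServiceNow's REST Table API.

```python
# Hypothetical mapping from an external inventory feed to
# ServiceNow-style cmdb_ci fields.
FIELD_MAP = {
    "hostname": "name",
    "ip_addr": "ip_address",
    "env": "environment",
}

def transform(record: dict) -> dict:
    """Rename mapped fields, normalize values, and drop unmapped keys."""
    out = {}
    for src, dst in FIELD_MAP.items():
        if src in record:
            out[dst] = str(record[src]).strip().lower()
    return out

payload = transform({"hostname": " APP-01 ", "ip_addr": "10.0.0.5", "owner": "x"})
print(payload)  # {'name': 'app-01', 'ip_address': '10.0.0.5'}
```

Keeping the mapping in a single declarative table like this is what makes the "comprehensive documentation of data mappings" requirement tractable: the code and the documented contract stay in one place.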
Posted 6 days ago