3.0 - 8.0 years
5 - 15 Lacs
bengaluru
Hybrid
Looking for candidates with an immediate to 15 days notice period only.

Role & responsibilities

Core skills:
- SQL Proficiency: writing complex queries, joins, subqueries, and aggregations; optimizing query performance for large datasets.
- Data Modeling & Schema Understanding: relational schema design (Oracle, MySQL) and document-based schema design (MongoDB).
- ETL & Data Transformation: extracting data from multiple sources; cleaning, transforming, and loading it into reporting layers.
- Report Automation & Scheduling: using tools such as Oracle Scheduler, cron jobs, or BI platforms; automating refresh cycles and alerts.

Oracle-specific skills (any of these are important):
- PL/SQL Development: writing stored procedures, functions, and packages; exception handling and performance tuning.
- Oracle BI Tools: Oracle BI Publisher, Oracle Analytics Cloud; integration with dashboards and enterprise reporting.
- Materialized Views & Indexing: creating summary tables for faster reporting; managing refresh strategies.
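As a rough illustration of the SQL proficiency this role calls for (joins, subqueries, and aggregations), here is a minimal sketch using Python's built-in sqlite3 module; SQLite stands in for Oracle/MySQL, and the table and column names are invented for the example.

```python
import sqlite3

# In-memory SQLite stands in for an Oracle/MySQL reporting database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         amount REAL, order_date TEXT);
    INSERT INTO customers VALUES (1, 'South'), (2, 'North');
    INSERT INTO orders VALUES (1, 1, 120.0, '2024-01-05'),
                              (2, 1,  80.0, '2024-02-10'),
                              (3, 2, 300.0, '2024-02-11');
""")

# Join + aggregation, with a subquery that keeps only regions whose total
# order value exceeds the overall average order amount.
query = """
SELECT c.region, COUNT(*) AS order_count, SUM(o.amount) AS revenue
FROM orders o
JOIN customers c ON c.id = o.customer_id
GROUP BY c.region
HAVING SUM(o.amount) > (SELECT AVG(amount) FROM orders)
ORDER BY revenue DESC;
"""
for row in conn.execute(query):
    print(row)
```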
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
The Informatica Axon Analyst is a key player in ensuring effective management of data governance and data management initiatives within the organization. Acting as a liaison between IT and business stakeholders, you will align metadata requirements with overall business strategies. Your responsibilities will include administering and optimizing the Informatica Axon platform to classify, manage, and govern data appropriately. Your expertise in data governance principles and technical skills will be crucial in maintaining data quality and integrity and promoting a data-driven decision-making culture. Defining and implementing data stewardship processes to ensure compliance with regulations and policies will also be part of your role. By applying your analytical and problem-solving abilities, you will contribute to enhancing the organization's data systems and operational excellence.

Key responsibilities include: administering and maintaining the Informatica Axon platform; collaborating with cross-functional teams to establish data stewardship roles; gathering and analyzing business requirements; creating and managing data dictionaries, glossaries, and taxonomies; monitoring data quality metrics; designing and implementing data governance frameworks; conducting training sessions and facilitating workshops; developing continuous improvement processes; ensuring compliance with industry standards and regulations; assisting in data mapping and lineage initiatives; generating reports and dashboards; supporting data integration projects; identifying automation opportunities; participating in data governance council meetings; and serving as the primary contact for Axon-related inquiries and troubleshooting.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with a minimum of 3 years of experience in data governance or data management roles. Strong proficiency with the Informatica Axon platform, data governance frameworks, SQL, data modeling concepts, and data visualization tools is required, as are excellent analytical, problem-solving, communication, and interpersonal skills. Project management experience and certifications in data management or data governance are preferred. Proficiency in metadata management and data lineage concepts, the ability to handle multiple projects, knowledge of data privacy and compliance regulations, attention to detail, a quality assurance mindset, and a willingness to stay updated with evolving data governance tools and techniques are also necessary for success in this role.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
The ETL Developer with expertise in Datastage and Snowflake plays a crucial role in the organization by managing and improving the data pipeline initiatives. You will be responsible for extracting, transforming, and loading data from diverse sources to the Snowflake environment, ensuring high-quality data availability for analytics and reporting. As companies increasingly rely on data-driven decisions, the need for a skilled ETL Developer has grown significantly. You will work closely with data analysts, data scientists, and other stakeholders to understand business requirements and translate them into robust ETL processes. By optimizing and maintaining data pipelines, you will enhance data accessibility and efficiency, thus driving informed decision-making and strategic initiatives within the company. Design, develop, and maintain ETL processes using Datastage and Snowflake. Collaborate with data architects and analysts to gather requirements and specifications. Extract data from multiple sources, ensuring integrity and security. Transform data according to business needs, applying rules and practices. Load transformed data into Snowflake, optimizing for performance and efficiency. Monitor ETL jobs for performance, troubleshoot issues, and ensure timely execution. Implement data quality checks and validation processes. Document ETL processes, data flows, and transformations for future reference. Work with the data team to design and implement scalable data models. Enhance existing ETL processes for better performance and reliability. Conduct root cause analysis for data discrepancies or failure in data pipelines. Stay updated with new technologies and trends related to ETL and data warehousing. Train and guide junior team members on ETL best practices. Participate in data governance initiatives, ensuring compliance with policies. Adopt Agile methodologies in project execution for efficient workflow. Required Qualifications: - Bachelor's degree in Computer Science, Information Technology, or related field. - 5+ years of experience working as an ETL Developer or in a similar role. - Proficient in Datastage, with a solid understanding of its functionalities. - Experience with Snowflake and cloud-based data solutions. - Strong command of SQL and relational databases. - Familiarity with data modeling concepts and dimensional modeling. - Experience with performance tuning and optimization of ETL jobs. - Knowledge of data governance and data quality frameworks. - Ability to work collaboratively in a team-oriented environment. - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Experience with Agile/Scrum methodologies preferred. - Understanding of big data technologies and frameworks is a plus. - Certifications in data warehousing or ETL tools are advantageous. - Willingness to learn new technologies and tools as needed. - Attention to detail with a focus on maintaining data integrity.,
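A minimal sketch of the extract-transform-load flow described in this posting, assuming an inlined CSV source and using SQLite as a stand-in target; a real pipeline would use DataStage stages or the Snowflake connector, and the table name and cleaning rule here are illustrative only.

```python
import csv
import io
import sqlite3

# Extract: source rows, inlined here instead of a real file or DataStage stage.
source = io.StringIO("id,amount,currency\n1,100.5,usd\n2,,usd\n3,75.0,eur\n")
rows = list(csv.DictReader(source))

# Transform: drop rows with missing amounts, normalise currency codes.
clean = [
    {"id": int(r["id"]), "amount": float(r["amount"]), "currency": r["currency"].upper()}
    for r in rows
    if r["amount"]  # simple data-quality rule: amount must be present
]

# Load: SQLite stands in for the Snowflake reporting layer.
target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE fact_payment (id INTEGER, amount REAL, currency TEXT)")
target.executemany("INSERT INTO fact_payment VALUES (:id, :amount, :currency)", clean)
print(target.execute("SELECT COUNT(*), SUM(amount) FROM fact_payment").fetchone())
```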
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
The Data Modeler - Erwin plays a crucial role in the design and implementation of data models that meet organizational needs. You will be responsible for translating business requirements into well-structured, reusable data models while ensuring data integrity and efficiency. Working closely with stakeholders, you will gather data requirements and translate them into models that provide input for database architectures. Utilizing the Erwin tool, you will enhance data management strategies and ensure compliance with governance standards. Your role is vital as it supports the company's ability to make data-driven decisions and derive insights that align with strategic objectives. Key Responsibilities - Design and maintain logical and physical data models using Erwin. - Collaborate with business analysts and stakeholders to gather data requirements and translate business processes into comprehensive data models. - Ensure data integrity, quality, and security in all modeling activities and implement best practices for data governance and management. - Develop and update metadata associated with data models and provide technical support for database design and implementation. - Conduct data profiling and analysis to define requirements, create data flow diagrams and entity-relationship diagrams, and review and refine data models with stakeholders and development teams. - Perform impact analysis for changes in the modeling structure, train and mentor junior data modeling staff, and ensure compliance with data standards and regulations. - Collaborate with ETL developers to optimize data extraction processes and document modeling processes, methodologies, and standards for reference. Required Qualifications - Bachelors degree in Computer Science, Information Technology, or a related field. - Minimum of 3 years of experience as a data modeler or in a related role with proven expertise in using Erwin for data modeling. - Strong knowledge of relational databases and SQL, experience in data architecture and database design principles, and familiarity with data warehousing concepts and practices. - Ability to analyze complex data structures, recommend improvements, understand data governance frameworks and best practices, and possess excellent analytical and problem-solving skills. - Strong communication and documentation skills, ability to work collaboratively in a team-oriented environment, experience with data integration and ETL processes, and ability to manage multiple projects and deadlines effectively. - Familiarity with data visualization and reporting tools is a plus, willingness to keep skills updated with ongoing training and learning, and certification in Data Modeling or equivalent is desirable. Skills: entity-relationship diagrams, data modeling, documentation skills, database design principles, ETL processes, SQL proficiency, data integration, data architecture, DAX, database design, data governance, data security, SQL, Power Query, data governance frameworks, relational databases, analytical skills, problem-solving, data quality, communication skills, data warehousing, analytical thinking, data flow diagrams, team collaboration, Erwin, data profiling,
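To make the modeling responsibilities concrete, here is a small sketch of a logical star-schema model (one fact, two dimensions) expressed as physical DDL; the entity and column names are invented, and in practice Erwin would capture the same entities and relationships and forward-engineer DDL for the actual target database.

```python
import sqlite3

# Physical DDL for a tiny star schema: one fact table referencing two dimensions.
ddl = """
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,   -- e.g. 20240105
    calendar_date TEXT NOT NULL
);
CREATE TABLE fact_sales (
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    quantity     INTEGER NOT NULL,
    amount       REAL NOT NULL,
    PRIMARY KEY (customer_key, date_key)
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print([r[0] for r in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")])
```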
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
kochi, kerala
On-site
As an ORMB Technical Architect, you will be responsible for understanding the ORMB technical architecture and framework methods. You must possess a strong understanding of the underlying infrastructure, including Web Application Server, Business Application Server, and Database Server. Your role will involve designing and supporting real-time and batch programs to meet business requirements effectively. You will be expected to have in-depth knowledge of deployment models (On-Prem, SaaS) and integration strategies with third-party applications. Your expertise should extend to Oracle Utilities Application Framework (OUAF) and integrating technologies such as Java, REST, XML/JSON, Oracle Database, Groovy, and plugin-driven batches. In this role, your ability to write complex SQL queries within the ORMB data model is crucial for retrieving, manipulating, and managing data efficiently. You should also have strong expertise in ORMB configuration, extension, enhancement, and reporting capabilities, including customizations tailored to project needs. Your responsibilities will include understanding the ORMB table structure, developing interfaces, and managing data migration effectively. Excellent communication skills are essential for client-facing roles, where you will address concerns, provide updates, and manage expectations professionally. You will be required to prepare configuration design documents and develop ORMB configurations, including Service Scripts, Plug-in Scripts, BPA Scripts, UI Maps, Zones, and Portals. Collaboration with QA teams to develop test scripts and execute end-to-end testing of customizations and enhancements is also part of the role. Experience in providing ongoing functional and technical production support during and after the go-live phase is necessary. Your expertise in ORMB configurations and enhancements will contribute significantly to the success of projects in this role.,
Posted 1 week ago
3.0 - 6.0 years
3 - 8 Lacs
bengaluru, karnataka, india
On-site
We are seeking a skilled Business Analyst with expertise in Cloud technologies and Financial Operations (FinOps) to join our team in India. The ideal candidate will play a critical role in analyzing business requirements and implementing cloud solutions that optimize financial performance. Responsibilities Gather and analyze business requirements from stakeholders to understand their needs. Develop and document business processes and workflows in the context of cloud and FinOps. Collaborate with cross-functional teams to design and implement cloud-based solutions. Perform data analysis to support financial operations and optimize costs in cloud environments. Create and maintain reports and dashboards to track key performance indicators (KPIs) related to FinOps. Assist in the development of financial models and forecasts for cloud expenditures. Identify opportunities for process improvements and automation in business operations. Skills and Qualifications Bachelor's degree in Business Administration, Finance, Information Technology, or a related field. 3-5 years of experience as a Business Analyst or in a related role, preferably with a focus on Cloud and FinOps. Strong understanding of cloud computing concepts, models, and services (e.g., AWS, Azure, Google Cloud). Knowledge of financial operations and budgeting principles in a cloud context. Proficient in data analysis tools and techniques, including Excel, SQL, and visualization tools like Power BI or Tableau. Excellent communication and interpersonal skills to interact effectively with stakeholders at all levels. Strong problem-solving skills and attention to detail. Experience with Agile methodologies and project management tools is a plus.
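As a toy illustration of the FinOps-style analysis described above, the sketch below aggregates hypothetical cloud billing line items by service and flags overruns against an assumed budget threshold; the figures and threshold are made up.

```python
from collections import defaultdict

# Hypothetical billing export rows: (service, monthly_cost_usd).
line_items = [
    ("compute", 1200.0), ("compute", 450.0),
    ("storage", 300.0), ("data-transfer", 220.0), ("storage", 95.0),
]
budget_per_service = 1000.0  # illustrative threshold

totals = defaultdict(float)
for service, cost in line_items:
    totals[service] += cost

for service, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    flag = "OVER BUDGET" if total > budget_per_service else "ok"
    print(f"{service:<14} {total:>8.2f}  {flag}")
```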
Posted 1 week ago
9.0 - 14.0 years
1 - 20 Lacs
hyderabad, telangana, india
On-site
Description We are seeking a highly skilled Director of Commercial Forecasting and Analytics to join our team in India. This role will be responsible for leading the development of robust forecasting models and analytical frameworks that drive strategic business decisions. Responsibilities Lead the development of commercial forecasting models and analytical frameworks to inform business strategy. Collaborate with cross-functional teams to gather data and insights for accurate forecasting. Analyze market trends and competitive landscape to support decision-making processes. Present forecasting and analytics results to senior management and stakeholders in a clear and concise manner. Continuously improve forecasting methodologies and tools to enhance accuracy and efficiency. Manage a team of analysts, providing guidance and mentorship to drive high performance. Skills and Qualifications Bachelor's degree in Business, Economics, Statistics, or a related field; MBA preferred. 9-14 years of experience in commercial forecasting, analytics, or a related field. Proficiency in data analysis tools such as Excel, SQL, R, or Python. Strong understanding of statistical methods and forecasting techniques. Experience with data visualization tools (e.g., Tableau, Power BI) to communicate insights effectively. Excellent analytical, problem-solving, and critical-thinking skills. Strong project management skills and ability to handle multiple priorities. Excellent communication and presentation skills, with the ability to convey complex information to non-technical stakeholders.
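As a simple illustration of the forecasting work described, the sketch below applies single exponential smoothing to a short monthly sales series; the figures and smoothing factor are illustrative, and production commercial forecasts would use far richer models and data.

```python
def exponential_smoothing(series, alpha=0.3):
    """Single exponential smoothing; returns the one-step-ahead forecast
    after processing the whole series."""
    forecast = series[0]
    for actual in series[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

monthly_sales = [120, 135, 128, 150, 162, 158]  # illustrative units sold
print(round(exponential_smoothing(monthly_sales), 1))
```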
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As a Salesforce Technical Architect at our organization, you will play a pivotal role in designing and implementing cutting-edge solutions for our healthcare clients. Your expertise will be instrumental in creating scalable and high-performance architectures that redefine the possibilities within the Salesforce ecosystem. You will be responsible for leading our digital transformation journey by leveraging your deep understanding of Salesforce and taking ownership of the architecture and design space. To excel in this role, you must hold a minimum of two Salesforce Architect Designer certifications, in addition to two other certifications such as Developer, Administrator, Sales, and Service Cloud. Possessing certifications in Experience Cloud and Einstein would be highly advantageous. Your background should include over 3 years of experience as an enterprise/platform architect and a total of 10+ years in the industry, with a proven track record of architecting global-scale enterprise applications. Proficiency in Lightning Web Components, Lightning Data Service, and modern JavaScript frameworks is essential for this role. Your hands-on experience with these technologies will be crucial in bringing our solutions to life. You should also have a strong grasp of advanced design patterns in Salesforce (Apex), .NET, Java, or similar platforms, coupled with expert-level skills in relational databases and SQL. Your commitment to creating clean, scalable architectures that enhance customer experience will set you apart. You should possess a deep understanding of various platform features, trigger architecture, and scalable integration patterns. Your passion, drive, and go-getter attitude will be key in inspiring excellence within the team. In this role, you will lead technical design sessions to translate complex healthcare challenges into innovative Salesforce solutions. Mentoring and upskilling the development team to uphold technical excellence will be a core responsibility. Collaborating with business analysts to architect system solutions that drive client success and devising strategies to address complex technical challenges in the healthcare IT landscape will be part of your impact. By championing best practices in Salesforce development, you will establish new benchmarks for quality and efficiency.,
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
The PL/SQL Developer plays a critical role in supporting the development, enhancement, and maintenance of database applications within our organization. This position is vital for ensuring the integrity of database systems and systems" performance through effective programming and optimization of PL/SQL code. The ideal candidate will leverage their expertise in relational database technologies to design and implement complex SQL queries, stored procedures, and triggers, thereby enabling efficient data retrieval and manipulation. By closely collaborating with software engineers and data analysts, the PL/SQL Developer facilitates seamless integration of database functionalities with application workflows. Contributing to the overall data strategy, this role is not only centered on coding but also involves identifying and resolving performance issues. In a dynamically changing technological environment, the PL/SQL Developer must maintain knowledge of industry best practices, continuously improving their skills to deliver robust database solutions. Design and develop PL/SQL scripts for data manipulation and retrieval. Write efficient and optimized SQL queries to enhance performance. Develop stored procedures, functions, and triggers to automate processes. Conduct thorough debugging and troubleshooting of PL/SQL code. Implement database performance tuning strategies. Collaborate with application developers to integrate database solutions. Maintain documentation of database structures, code changes, and updates. Conduct code reviews and provide constructive feedback to peers. Support data migration and data cleansing activities. Work closely with business stakeholders to understand data requirements. Monitor database performance and implement improvements as needed. Enhance existing PL/SQL applications for improved efficiency. Stay updated with new database technologies and best practices. Participate in disaster recovery and data backup procedures. Ensure compliance with data governance policies and practices. Bachelor's degree in Computer Science, Information Technology, or related field. Minimum of 3 years of experience in PL/SQL development. Strong knowledge of Oracle databases and PL/SQL programming. Proficient in SQL and database design principles. Experience with performance tuning and optimization techniques. Familiarity with database management tools and software. Ability to write complex queries and stored procedures. Knowledge of data modeling concepts and best practices. Experience in working within an Agile development environment. Strong analytical and problem-solving skills. Excellent communication and team collaboration abilities. Experience with version control systems (e.g., Git, SVN). Proven ability to deliver projects within deadlines. Knowledge of additional programming languages (e.g., Java, Python) is a plus. Experience with ETL processes or data warehousing solutions is a benefit.,
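PL/SQL itself runs inside an Oracle database, so a fully self-contained example is not possible here; as a stand-in, this sketch uses SQLite via Python to show the same trigger idea an Oracle PL/SQL trigger would implement, auditing balance changes into a separate table. The table names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE accounts_audit (account_id INTEGER, old_balance REAL,
                                 new_balance REAL, changed_at TEXT);

    -- Same idea as an Oracle AFTER UPDATE trigger written in PL/SQL:
    -- every balance change is recorded in an audit table.
    CREATE TRIGGER trg_accounts_audit
    AFTER UPDATE OF balance ON accounts
    BEGIN
        INSERT INTO accounts_audit
        VALUES (OLD.id, OLD.balance, NEW.balance, datetime('now'));
    END;
""")
conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = 250.0 WHERE id = 1")
print(conn.execute("SELECT * FROM accounts_audit").fetchall())
```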
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
The ETL Developer - ODI Developer plays a critical role in the data management ecosystem of the organization. You will focus on the extraction, transformation, and loading of data using Oracle Data Integrator (ODI), ensuring accurate and timely data availability across various systems. Collaborating closely with data analysts and business stakeholders, you will design robust data flows and optimize existing ETL processes. Your deep understanding of data warehousing concepts and proficiency with SQL and ODI tools are essential for maintaining the integrity, efficiency, and performance of our data systems. Your contributions will empower teams with reliable data insights for strategic decision-making and support data-driven initiatives. Key Responsibilities - Design and develop ETL processes using Oracle Data Integrator. - Extract data from various sources such as SAP, CSV files, and databases. - Transform data based on business rules and requirements. - Load data into target data warehouses and data marts. - Conduct data profiling to ensure data quality and consistency. - Optimize ETL performance and data processing efficiency. - Debug and troubleshoot ETL workflows and data integration issues. - Collaborate with BI analysts to comprehend data requirements. - Implement best practices for ETL development and maintenance. - Document ETL processes and workflows for future reference. - Perform regular maintenance and upgrades on existing data pipelines. - Ensure compliance with data governance and security policies. - Participate in code reviews and promote team knowledge sharing. - Monitor ETL jobs for performance and reliability. - Assist in user training and support related to ETL tools. Required Qualifications - Bachelor's degree in Computer Science, Information Technology, or a related field. - 3+ years of experience in ETL development using Oracle Data Integrator. - Strong knowledge of SQL and database technologies. - Experience with data modeling and data warehousing concepts. - Familiarity with various data source integrations. - Proficiency in performance tuning of ETL processes. - Experience with debugging and performance optimization. - Knowledge of data governance and data quality best practices. - Ability to work collaboratively in a team environment. - Strong analytical and problem-solving skills. - Excellent communication skills, both written and verbal. - Experience with source control tools like Git. - Familiarity with deployment automation tools is a plus. - Certifications in Oracle Data Integrator or similar tools are advantageous. - Ability to manage multiple tasks and deadlines effectively. - Commitment to continuous learning and professional development.,
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
haryana
On-site
The Junior Data Analyst plays a vital role within the organization, responsible for converting data into actionable insights that facilitate decision-making. This entry-level position caters to individuals passionate about data analysis, aiming to enhance their analytical capabilities while contributing to the company's overall success. Working closely with senior analysts and team members, the Junior Data Analyst collaborates in data collection from diverse sources, trend analysis, and report generation. By utilizing statistical techniques and data visualization tools, the analyst identifies patterns and opportunities for improvement. The significance of this role lies in its ability to influence strategic initiatives, boost operational efficiency, and propel business growth through well-informed decision-making. Ideal for those embarking on a career in data analysis, applicants should possess a solid foundation in analytical skills and statistical methodologies. Key Responsibilities: - Assisting in data collection and cleansing from various sources. - Performing statistical analysis to detect trends and patterns. - Supporting the creation of dashboards and data visualization reports. - Collaborating with team members to delineate data requirements. - Presenting findings to stakeholders and management. - Utilizing Excel, SQL, and other tools for data manipulation. - Conducting regular data quality checks for accuracy assurance. - Generating ad-hoc reports as per management requests. - Documenting data processes and methodologies for future reference. - Extracting meaningful insights from large datasets. - Providing data analyses to support research projects. - Ensuring data integrity across different systems. - Identifying process improvement areas based on data insights. - Staying abreast of industry trends and best practices in data analysis. - Participating in team meetings to discuss project statuses and findings. Required Qualifications: - Bachelor's degree in Data Science, Statistics, Mathematics, or a related field. - Proficiency in SQL and database management. - Strong skills in Microsoft Excel and data visualization tools. - Familiarity with statistical software (e.g., R, Python, or SAS). - Sharp analytical skills with keen attention to detail. - Fundamental grasp of data analysis techniques and methodologies. - Ability to effectively communicate complex data to non-technical stakeholders. - Preferably, internship or project experience in data analytics. - Proactive problem-solving approach with critical thinking abilities. - Eagerness to learn and adapt in a dynamic environment. - Capability to work independently and collaboratively within a team. - Strong organizational and time management capabilities. - Knowledge of data cleaning and preprocessing techniques. - Experience with reporting and data visualization tools (e.g., Tableau, Power BI) is advantageous. - Understanding of business intelligence concepts. Skills: critical thinking, problem-solving, Python, SQL proficiency, reporting tools, data cleaning, data visualization, Power BI, data visualization tools, SAS, statistical analysis, Microsoft Excel, Tableau, R, business intelligence, SQL, communication skills, statistical software,
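A small sketch of the routine cleaning and quality checks this role involves, assuming pandas is available; the DataFrame is built inline so the example is self-contained, and the column names and fill rule are illustrative.

```python
import pandas as pd

# Inline sample data standing in for an extract from a source system.
df = pd.DataFrame({
    "customer_id": [101, 102, 102, 103, 104],
    "signup_date": ["2024-01-03", "2024-01-05", "2024-01-05", None, "2024-02-01"],
    "monthly_spend": [49.0, 99.0, 99.0, 25.0, None],
})

# Basic quality checks: exact duplicates and missing values per column.
print("duplicate rows:", int(df.duplicated().sum()))
print(df.isna().sum())

# Cleaning: drop exact duplicates, fill missing spend with the median.
clean = df.drop_duplicates().copy()
clean["monthly_spend"] = clean["monthly_spend"].fillna(clean["monthly_spend"].median())
print(clean)
```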
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
One of the fastest growing fintech companies has developed a full-stack financial platform for Bharat 2.0, offering multiple financial products such as Lending, Insurance, and Investments. With an AUM exceeding INR 1000 cr and a workforce of approximately 500 employees, they are at the forefront of the industry. As a Credit Risk Modeling Manager, your primary responsibilities will involve developing and implementing advanced credit risk models using statistical methodologies. You will be tasked with analyzing large datasets to discern trends, patterns, and anomalies associated with credit risk. Collaboration with cross-functional teams to gather requirements and ensure alignment with business goals is essential. Additionally, you will be responsible for monitoring and evaluating the performance of existing credit risk models, making necessary adjustments, and communicating findings and recommendations clearly to stakeholders. In this role, you will lead a team of data scientists, providing mentorship and guidance to facilitate their professional growth. Designing and executing experiments to validate model assumptions, improve accuracy, and ensure compliance with industry regulations and internal policies related to credit risk assessment will be part of your core responsibilities. Utilizing data visualization techniques to present complex data insights to non-technical stakeholders is crucial. You will also stay abreast of emerging trends and tools in data science and credit risk modeling, conduct risk assessments and stress testing, and collaborate with IT and data engineering teams to ensure data availability and integrity. Your qualifications should include a Bachelor's degree in statistics, mathematics, computer science, or a related field, along with a minimum of 5 years of experience in credit risk modeling or data science. A minimum of 2 years of team management experience is mandatory. Proficiency in statistical modeling techniques, machine learning algorithms, Python programming, SQL, and relevant libraries (e.g., Pandas, Scikit-learn) is required. Knowledge of credit risk regulations, industry standards, data visualization tools (e.g., Tableau, Power BI), and the ability to effectively communicate complex concepts to diverse audiences are essential. Successful candidates will have a proven track record of leading and mentoring junior team members, strong analytical and problem-solving skills, experience with data preprocessing and feature engineering, and the ability to work collaboratively in a team-oriented environment. Excellent time management skills, attention to detail, a strong understanding of business drivers and implications of credit risk, and a willingness to continuously learn and adapt to new methodologies and technologies are also key attributes for this role. Key Skills: risk management, team leadership, Tableau, predictive analytics, credit risk modeling, Power BI, Python, SQL proficiency, problem-solving, data preprocessing, statistical modeling, Scikit-learn, machine learning, data science, credit risk, feature engineering, machine learning algorithms, data visualization, Pandas, SQL, statistical modeling techniques.,
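A minimal sketch of a credit-risk style classifier using scikit-learn on synthetic data; a real probability-of-default model would use actual application and bureau features, rigorous validation, and the monitoring and governance steps described above.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for application/bureau features and a default flag.
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                           weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# AUC is a common discrimination metric for credit-risk models.
probs = model.predict_proba(X_test)[:, 1]
print("ROC AUC:", round(roc_auc_score(y_test, probs), 3))
```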
Posted 2 weeks ago
3.0 - 6.0 years
3 - 6 Lacs
lucknow, uttar pradesh, india
On-site
Requirements Gathering & Analysis: Collaborate with clients to understand their specific business needs and challenges. Analyze existing supply chain processes and identify opportunities for improvement using Kinaxis RapiResponse. Kinaxis Configuration & Customization: Design, build, and configure Kinaxis RapiResponse applications tailored to business requirements. Develop and implement data models, integrations, and custom scripts within the Kinaxis platform. Ensure accurate and efficient data flow across the system. Project Delivery & Support: Work closely with cross-functional teams, including internal and client stakeholders, to ensure successful implementation and delivery of solutions. Provide technical support and troubleshooting for deployed Kinaxis solutions. Maintain and update technical documentation related to projects. Continuous Improvement: Stay current on the latest developments in supply chain planning and Kinaxis RapiResponse features. Proactively identify and implement enhancements to optimize solutions and improve client satisfaction. Qualifications: Education: Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, Statistics, or related field. Experience: Minimum 3+ years of hands-on experience with Kinaxis RapiResponse or similar supply chain planning software. Technical Skills: Kinaxis RapiResponse Expertise: Experience configuring Master Planning, Demand Planning, Supply Planning, Inventory Optimization, and Sales & Operations Planning (S&OP). Advanced scripting skills (e.g., RapidTables) for data manipulation, workflow automation, and custom calculations. Integration experience with ERP, CRM, WMS, or other enterprise systems via APIs, ETL tools, and middleware. Proficiency in data modeling within Kinaxis, ensuring data accuracy and integrity. Ability to create and maintain reports and dashboards, including integration with BI platforms. Data & Analytics: Strong SQL skills for ETL processes. Experience with data analysis and visualization tools such as Excel, Tableau, and Power BI. Familiarity with data warehousing and data lake concepts is a plus. Cloud Technologies: Familiarity with cloud platforms such as AWS, Azure, or GCP, particularly for supply chain solutions. Project Management: Experience working in Agile environments using Scrum or Kanban methodologies. Proficient with project management tools like Jira or Asana.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You will be part of a dynamic and innovative company specializing in data-driven solutions with a mission to empower businesses with actionable insights derived from their data. Collaboration, creativity, and a commitment to excellence are highly valued, fostering a culture that encourages continuous learning and growth. Providing top-notch services to clients while maintaining a supportive and inclusive work environment is a key focus. Your responsibilities will include developing and implementing data models to extract insights and support business decisions, utilizing Snowflake for managing and optimizing data storage and retrieval processes, analyzing large datasets to identify trends and anomalies, collaborating with cross-functional teams to understand data-related needs, creating data visualizations to effectively communicate findings, conducting statistical analysis and predictive modeling using Python, designing and executing ETL processes for data quality and integrity, maintaining documentation of data workflows and analyses, providing mentorship to junior data team members, staying current with industry trends and best practices in data science, identifying opportunities for process improvements in data management, assisting in creating reports for stakeholders based on data analysis, implementing machine learning algorithms for data prediction accuracy, working with data engineers to build and maintain data pipelines, and ensuring compliance with data governance and security policies. Qualifications for this role include a Bachelor's degree in Data Science, Statistics, Computer Science, or a related field, with 5+ years of experience in data science or related roles. Expertise in Snowflake and SQL for data manipulation, experience with Python for data analysis and modeling, hands-on experience with machine learning techniques, knowledge of data visualization tools (e.g., Tableau, Power BI), familiarity with ETL tools and processes, strong analytical and problem-solving skills, excellent communication skills, ability to work collaboratively in a team-oriented environment, knowledge of statistical methods and tools, experience with cloud data warehousing solutions, ability to manage multiple projects and deadlines effectively, self-motivated with a passion for continuous learning, and understanding of data governance and compliance issues are required. Skills required for this role include SQL, ETL, data visualization, statistical analysis, Python, Snowflake, cloud data warehousing, machine learning, data analysis, SQL proficiency, statistical modeling, data governance, and compliance.,
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
As a Talend ETL Developer at Team Geek Solutions, you will be responsible for designing, developing, and maintaining ETL processes using Talend. Your role will involve implementing data integration solutions, collaborating with business stakeholders to understand data requirements, and optimizing SQL queries for data extraction and manipulation. You will be tasked with ensuring data accuracy and quality through data profiling and analysis, as well as monitoring and troubleshooting ETL jobs to ensure smooth data flow. Additionally, you will be required to maintain documentation for ETL processes and data model designs, work with team members to design and enhance data warehouses, and develop data transformation logic to meet business needs. To excel in this role, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have proven experience as an ETL Developer with expertise in Talend. Your strong understanding of ETL frameworks, data integration principles, and proficiency in writing and troubleshooting SQL queries will be essential. Experience in data modeling, database design, and familiarity with data quality assessment methodologies are also required. Your ability to analyze complex data sets, provide actionable insights, and demonstrate strong problem-solving and analytical skills will be crucial. Excellent communication and interpersonal skills are necessary for collaborating effectively in a team-oriented environment. Knowledge of data warehousing concepts, best practices, and experience with Agile development methodologies will be valuable assets. Your willingness to learn new technologies and methodologies, attention to detail, commitment to delivering high-quality solutions, and ability to manage multiple tasks and deadlines effectively are key attributes for success in this role. Experience with performance tuning and optimization of ETL jobs is a plus. If you are passionate about data warehousing, troubleshooting, ETL processes, workflow management, data modeling, SQL, data profiling and analysis, data governance, data integration, and Agile methodology, and possess the required skills and qualifications, we invite you to join our innovative and collaborative team at Team Geek Solutions in Mumbai or Pune.,
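A stdlib-only sketch of the data profiling mentioned above: per-column null counts, distinct counts, and min/max over inlined sample rows; a Talend job or a dedicated profiling tool would do the same at scale, and the sample table is invented.

```python
# Inline sample rows standing in for a source table to be profiled.
rows = [
    {"id": 1, "country": "IN", "amount": 120.0},
    {"id": 2, "country": None, "amount": 80.5},
    {"id": 3, "country": "IN", "amount": None},
    {"id": 4, "country": "US", "amount": 310.0},
]

for col in rows[0].keys():
    values = [r[col] for r in rows]
    non_null = [v for v in values if v is not None]
    profile = {
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }
    print(col, profile)
```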
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
The Guidewire Policy Center position is a crucial role within the organization, tailored for individuals well-versed in insurance policy administration and Guidewire software. Your primary responsibility will be to optimize policy lifecycle management and ensure the efficiency and effectiveness of operational processes utilizing the Guidewire suite. Working closely with departments like underwriting, claims, and IT, you will contribute to enhancing and implementing systems that elevate our services. Your expertise in insurance technology and project management will be pivotal in driving digital transformation initiatives and enhancing our operational capabilities to provide an exceptional client experience. You will be tasked with implementing and configuring Guidewire Policy Center to align with the organization's business requirements. Collaborating with cross-functional teams, you will analyze, design, and deliver system enhancements. Your role will also involve conducting system testing and validation to meet functionality and performance objectives. Additionally, you will assist in developing and managing project plans, ensuring the timely delivery of milestones. Providing ongoing support and troubleshooting for system users will be essential in enhancing user experience and efficiency. Participating in requirements gathering and documentation to align technology solutions with business needs will be a key aspect of your role. You will maintain and update system configurations in compliance with regulatory standards and organizational requirements. Facilitating user training and creating documentation for policy administration processes will also be part of your responsibilities. Collaborating closely with the IT team to ensure seamless integration between Policy Center and other systems is crucial. Monitoring system performance, optimizing processes for enhanced operational efficiency, and engaging in project retrospectives and continuous improvement initiatives are also important tasks. Staying updated on the latest Guidewire features and industry trends to provide informed recommendations will be expected. Coordinating with external vendors for support and system enhancements, ensuring data integrity and security measures are maintained across all policy systems are also part of the role. Required Qualifications: - Bachelor's degree in Computer Science, Information Technology, or a related field. - 3+ years of experience in implementing and configuring Guidewire Policy Center. - Strong background in insurance operations or insurance technology. - Proficiency in Java, SQL, and Guidewire application programming. - Experience with Agile software development methodologies and project management. - Familiarity with systems integration and API management. - Ability to analyze complex requirements and translate them into technical specifications. - Excellent problem-solving skills and attention to detail. - Strong communication and interpersonal skills for effective client interaction. - Experience in conducting user training and developing training materials. - Knowledge of the regulatory landscape affecting insurance policy administration. - Certification in Guidewire products is a plus. - Adept at collaborating within a team-oriented environment. - Ability to manage multiple priorities and projects concurrently. - Commitment to continuous learning and professional growth. - Strong analytical skills with a focus on data-driven decision-making.,
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
The Actimize Developer is responsible for designing, developing, and implementing Actimize applications to meet business requirements. You will collaborate with business analysts to translate project requirements into technical specifications. Your role will involve participating in the full software development lifecycle, including analysis, design, implementation, testing, and deployment to ensure the performance, quality, and responsiveness of the Actimize applications. Conducting code reviews, providing feedback to team members, optimizing existing code, and troubleshooting applications will be key responsibilities. Integrating Actimize applications with other enterprise systems and databases, maintaining documentation of application design and deployment procedures, and working with stakeholders to gather system requirements are essential tasks. Adhering to coding standards, participating in Agile ceremonies, and staying updated on industry trends related to Actimize and financial compliance are also part of your role. Additionally, you will contribute to training and mentoring junior developers, create test cases, assist in user acceptance testing, and collaborate with the QA team to ensure thorough testing before production deployment. The successful Actimize Developer will have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with a minimum of 3 years of experience in Actimize development or similar roles. You should possess strong knowledge of the Java programming language, SQL, and database management systems such as Oracle and SQL Server. Experience with Actimize products, regulatory frameworks, compliance requirements, application integration, web services, Agile methodologies, and version control systems like Git is required. Strong analytical, problem-solving, communication, collaboration, and time management skills are essential. A detail-oriented approach, willingness to learn new technologies, and experience in the financial services industry are considered advantageous.,
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a member of the team at Viraaj HR Solutions, you will play a crucial role in designing, developing, and maintaining ETL processes using Talend Open Studio. Your responsibilities will include collaborating with cross-functional teams to gather requirements, executing data migration and transformation processes efficiently, and developing test cases for data integration. Monitoring and improving data quality metrics to ensure accuracy, managing troubleshooting and debugging of ETL processes, and documenting technical specifications will be essential aspects of your role. You will be expected to implement best practices in data management and analytics, assist in the extraction, transformation, and loading of large datasets, and ensure compliance with data governance and protection policies. Providing support for production issues, conducting performance tuning and optimization of ETL processes, and collaborating with the data warehouse team to ensure optimal data architecture will also be part of your responsibilities. Your qualifications should include a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Talend Open Studio Developer or in a similar ETL development role. Strong understanding of data integration and ETL best practices, proficiency in SQL and database management systems, and experience with data quality tools and methodologies are required. Familiarity with data visualization tools, excellent problem-solving and analytical skills, and the ability to work in a team-oriented environment are also important. Effective communication skills, both written and verbal, knowledge of agile methodologies and project management tools, and experience with cloud technologies and environments will be beneficial. Strong attention to detail, commitment to delivering high-quality work, ability to manage multiple tasks and meet deadlines, understanding of data governance and compliance regulations, and willingness to continuously learn and adapt to new technologies are necessary for success in this role. If you are passionate about driving success through innovative solutions, staying updated on industry trends and advancements in data integration technologies, and contributing to continuous improvement initiatives, we invite you to join our dynamic team at Viraaj HR Solutions.,
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
kochi, kerala
On-site
You will be responsible for designing and implementing data models in Snowflake for optimized storage and retrieval. Your role will also involve developing ETL processes to ensure robust data integration pipelines and collaborating with data analysts and business stakeholders to gather requirements for data solutions. Additionally, you will be required to perform data profiling to understand data quality, cleanse data as necessary, and optimize query performance using Snowflake-specific features. Monitoring data warehouse performance, troubleshooting data-related issues, and ensuring data security and compliance with company policies and regulations will be part of your responsibilities. Furthermore, you will need to create and maintain documentation for data processes and workflows, develop automated data workflows to reduce manual processing, and participate in architecture discussions contributing to the design of scalable data solutions. Integrating Snowflake with other cloud services for enhanced data functionality, performing regular data backups and disaster recovery exercises, and staying updated with the latest features and enhancements in Snowflake are essential aspects of the role. You will also be expected to train junior engineers and advocate for best practices in data engineering while contributing to the continuous improvement of data engineering processes. To qualify for this position, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field and have a minimum of 3 years of experience as a Data Engineer or in a similar role. Strong proficiency in Snowflake, hands-on experience in designing data models, expertise in SQL, experience with querying large datasets, and familiarity with ETL tools and data integration practices are required. Additionally, knowledge of cloud services such as AWS, Azure, or GCP, data warehousing concepts, and best practices, proficiency in performance tuning and optimization techniques, excellent problem-solving and analytical skills, and strong communication skills are essential. You should also have the ability to work collaboratively in a team environment, familiarity with data visualization tools (a plus), experience in Agile methodologies (advantageous), and certifications in Snowflake or data engineering (preferred). A proactive approach to learning and implementing new technologies is expected. This position offers a unique opportunity to work on exciting data projects at Viraaj HR Solutions, collaborating with talented professionals in a vibrant and innovative environment.,
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
You will play a crucial role in supporting and enhancing the utilization of our Sales Compensation solution by developing, configuring, and maintaining the Xactly Incent, Connect, and Extend platform to align with the organization's evolving needs. This involves the setup of new plans, credit rules, formulas, connect pipelines, workflows, as well as designing, developing, and testing reports and dashboards. Your responsibilities will include collaborating with the Xactly support team to implement and deploy changes across environments, ensuring seamless integration between Xactly Connect and other systems like Salesforce, creating user-friendly documentation and resources, and customizing Xactly Connect based on business requirements with input from stakeholders. To excel in this role, you should have at least 3 years of experience with Xactly Incent and Connect, familiarity with incentive compensation processes, SQL proficiency, knowledge of native and delta schemas in Connect, expertise in data analysis and reporting tools, and excellent communication skills. Additionally, you should be adept at managing permissions, access, personalization, and other system operations for Xactly users. Ideally, you will hold a Bachelor's degree or equivalent experience, have proven experience as a system administrator focusing on incentive compensation platforms, and possess a deep understanding of Xactly platform functionalities. Your ability to work collaboratively with cross-functional teams and apply best practices in maintaining and migrating Connect assets will be key to your success in this role.,
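At its core, incentive compensation applies credit rules and rate tiers to booked revenue; the sketch below shows a generic tiered-commission calculation in plain Python with made-up tiers. It is not Xactly Incent formula syntax, just the underlying idea.

```python
# Illustrative rate table: (attainment threshold as a fraction of quota, rate).
TIERS = [(0.0, 0.05), (0.8, 0.08), (1.0, 0.12)]  # 5% base, 8% near quota, 12% over

def commission(bookings, quota):
    """Apply each tier's rate to the slice of bookings that falls inside it."""
    payout = 0.0
    for i, (start, rate) in enumerate(TIERS):
        lower = start * quota
        upper = TIERS[i + 1][0] * quota if i + 1 < len(TIERS) else float("inf")
        slice_amount = max(0.0, min(bookings, upper) - lower)
        payout += slice_amount * rate
    return payout

# A rep who books 120% of a 100,000 quota crosses all three tiers.
print(commission(bookings=120_000, quota=100_000))
```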
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
delhi
On-site
You will be joining Viraaj HR Solutions, a dynamic and innovative HR consultancy that is committed to facilitating connections between talented professionals and reputable organizations. The company's mission is to empower businesses by providing them with the right talent to drive success and foster growth. At Viraaj HR Solutions, values such as integrity, teamwork, and excellence are highly regarded, shaping a culture that promotes creativity, professional development, and work-life balance. Your role as a Snowflake Developer will involve designing, developing, and implementing data solutions using the Snowflake platform. Your responsibilities will also include optimizing Snowflake databases and schemas for efficient data access, developing data integration workflows utilizing ETL tools, and maintaining data models to support business analytics. Additionally, you will collaborate with data scientists and analysts to address data-related issues, monitor Snowflake usage, and ensure data quality and integrity across all data stores. Documenting data architecture and design processes, conducting system performance tuning, and troubleshooting the Snowflake environment will also be part of your duties. Furthermore, you will integrate Snowflake with cloud-based services and tools as needed, participate in code reviews, stay updated on Snowflake features and best practices, and provide training to team members on Snowflake capabilities and tools. Working closely with stakeholders to gather requirements and define project scope is essential for success in this role. To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field and have a minimum of 3 years of experience in data engineering or database development. Proficiency in the Snowflake platform, SQL, ETL tools, and data warehousing concepts is required, with certification in Snowflake being advantageous. You should also possess strong analytical, problem-solving, communication, and team collaboration skills, as well as experience in data modeling, metadata management, and troubleshooting data-related issues. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud, as well as knowledge of data governance practices, is desirable. Additionally, the ability to work independently, manage multiple tasks effectively, and adapt to new technologies will be crucial for excelling in this role. Your expertise in cloud computing, data modeling, SQL proficiency, analytical skills, data integration, cloud platforms (AWS, Azure, Google Cloud), data governance, problem-solving, Snowflake, metadata management, troubleshooting, communication, team collaboration, data warehousing, and performance tuning will be invaluable in fulfilling the responsibilities of the Snowflake Developer role at Viraaj HR Solutions.,
Posted 2 weeks ago
0.0 years
1 - 1 Lacs
bhubaneswar, odisha, india
Remote
Description We are seeking freshers/entry-level candidates for the GIS position in our team in India. This role offers an exciting opportunity to work with geographic information systems and contribute to various projects that utilize spatial data for effective decision-making. Responsibilities Assist in the collection, analysis, and interpretation of geographic data. Support the development and maintenance of GIS databases and applications. Prepare detailed maps, reports, and presentations for various stakeholders. Conduct spatial analysis and modeling to support decision-making processes. Collaborate with team members on various GIS projects and initiatives. Stay updated with the latest GIS technologies and methodologies. Skills and Qualifications Proficiency in GIS software such as ArcGIS, QGIS, or similar. Strong analytical and problem-solving skills. Familiarity with spatial data formats and databases (e.g., shapefiles, GeoJSON). Basic knowledge of remote sensing and cartography principles. Ability to work collaboratively in a team environment. Strong communication skills, both written and verbal.
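A stdlib-only sketch of handling one of the spatial formats listed above (GeoJSON): parsing a small feature and computing its bounding box; in practice this would be done in QGIS/ArcGIS or with libraries such as shapely or geopandas, and the feature here is invented.

```python
import json

# A tiny inline GeoJSON feature (a polygon) standing in for a real dataset.
feature = json.loads("""
{
  "type": "Feature",
  "properties": {"name": "sample_plot"},
  "geometry": {
    "type": "Polygon",
    "coordinates": [[[77.59, 12.97], [77.61, 12.97], [77.61, 12.99],
                     [77.59, 12.99], [77.59, 12.97]]]
  }
}
""")

ring = feature["geometry"]["coordinates"][0]
xs = [pt[0] for pt in ring]
ys = [pt[1] for pt in ring]
bbox = (min(xs), min(ys), max(xs), max(ys))  # (min_lon, min_lat, max_lon, max_lat)
print(feature["properties"]["name"], "bbox:", bbox)
```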
Posted 4 weeks ago
10.0 - 15.0 years
0 Lacs
maharashtra
On-site
You will be responsible for leading the architectural design and modernization of defined components in the software engineering domain. Collaborating with product owners in the business, you will design and build solutions for IBD and GCM. Regular communication with product leads across the technology organization to identify opportunities for improving existing and future technology solutions will be essential. As a Technical Lead, you will be expected to act as a hands-on engineer, actively addressing the most challenging problems. Additionally, providing technical mentorship and leadership to a squad of developers will be a key part of your role. Your expertise in Java EE, Microservices, Web service development, REST, Services Oriented Architecture, Object-Oriented Design, Design patterns, Architecture, Application Integration, Databases, SpringBoot, Junit, BDDUnix/Linux will be crucial for success in this position. Experience with Web UI JS Framework such as AngularJS, NoSQL like MongoDB, and managing data through vendor feeds will be considered advantageous. If you are looking to lead and drive technological innovation while mentoring a team of developers, this role offers a rewarding opportunity to make a significant impact. For further details on this exciting opportunity, please reach out to 70454 59739 or email kajal@mmcindia.biz.,
Posted 1 month ago
6.0 - 9.0 years
6 - 9 Lacs
Delhi, India
On-site
Your Role: Lead OTM systems (IS) expertise across projects and operations, aligning with regional direction and working with internal and external stakeholders to ensure effective system deployment and support.

Your Responsibilities: Ensure deployment of only approved IS applications such as Transport Management Systems (TMS), visibility tools, and reports. Coordinate IS input for tender/proposal requests and ensure smooth TMS implementation during customer onboarding. Maintain ongoing TMS application support for existing operations. Support continuous improvement initiatives and innovation in logistics systems. Enforce TMS governance and compliance practices. Upskill team members through structured training and development activities.

Your Skills and Experiences: Minimum 6+ years of implementation and configuration experience in Oracle Transport Management (OTM) with both technical and functional expertise, especially in the distribution industry. Strong functional and techno-functional experience with OTM or Global Transportation Management implementations. Familiarity with OTM Release 6.5.X and above, including OTM Cloud. In-depth knowledge of key OTM application modules: order management, shipment management, OTM finance, automation agents, and interfaces. Ability to prepare mapping documents to interface OTM with EDI, WMS, Order Management, and Finance systems. Capable of translating operational requirements into technical design specifications for offshore delivery. Proficiency in SQL for automation agents and technical configurations. Experience with JSPX/XSL (preferred). Strong understanding of the end-to-end OTM lifecycle, including system architecture and implementations.
Posted 1 month ago
5.0 - 9.0 years
2 - 10 Lacs
Hyderabad, Telangana, India
On-site
Roles & Responsibilities: Collaborate with Product Teams and System Architects to understand business strategy, needs, and problems. Convert Epics into Features and granular User Stories with clear Acceptance Criteria and Definition of Done. Translate user stories into functional Anaplan model designs, ensuring alignment with best practices and Amgen architectural standards. Develop and maintain Anaplan modules, dashboards, and integrations. Create and validate proof-of-concepts (POCs) to test assumptions, validate solutions, or propose new features. Maintain up-to-date documentation of Anaplan model architecture, business logic, data integrations, and process configurations. Produce end-user guides, functional specs, and technical documentation to support user enablement and organisational change. Conduct impactful demos of Anaplan features internally to Product Teams and partners. Find opportunities to improve existing Anaplan models and processes. Stay current with Anaplan releases, features, and community standard processes; proactively recommend enhancements. Support the scaling of Anaplan across business units through templated solutions and reusable components. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Masters degree/Bachelor s degree with 5- 9 years of experience in Computer Science, IT or related field. Functional Skills:Must-Have Skills: Programming experience in at least one modern language (e. g. , Python, JavaScript, R, etc. ) for scripting, data transformation, or integration. Excellent problem-solving skills and a passion for tackling complex challenges with technology Experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA Good-to-Have Skills: Experience in managing product features for PI planning and developing product roadmaps and user journeys Familiarity with low-code, no-code test automation software Able to communicate technical or complex subject matters in business terms Experience in Agile/Scrum and DevOps environments. Professional Certifications: Anaplan Certified Model Builder (incl. L1 and L2 MB) (required) Cloud certifications (AWS Certified Solutions Architect, DevOps Engineer, etc. ) (preferred) Databricks certifications (Data Engineer Professional) (preferred) Soft Skills: Able to work under minimal supervision Excellent analytical and gap/fit assessment skills Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation Ability to manage multiple priorities successfully Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills
Posted 1 month ago