Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
6.0 - 12.0 years
11 - 15 Lacs
Pune
Work from Office
As a Data Architect, you'll design and optimize data architecture to ensure data is accurate, secure, and accessible. You'll collaborate across teams to shape the data strategy, implement governance, and promote best practices, enabling the business to gain insights, innovate, and make data-driven decisions at scale. Your responsibilities: Responsible for defining the enterprise data architecture which streamlines, standardises, and enhances accessibility of organisational data. Elicits data requirements from senior Business stakeholders and the broader IS function, translating their needs into conceptual, logical, and physical data models. Oversees the effective integration of data from various sources, ensuring data quality and consistency. Monitors and optimises data performance, collaborating with Data Integration and Product teams to deliver changes that improve data performance. Supports the Business, Data Integration Platforms team and wider IS management to define a data governance framework that sets out how data will be governed, accessed, and secured across the organisation; supports the operation of the data governance model as a subject matter advisor. Advises Data Platform teams in defining the Data Platform architecture, covering metadata, data integration, business intelligence, and data storage needs. Supports the Data Integration Platforms team and other senior IS stakeholders to define a data vision and strategy, setting out how the organisation will exploit its data for maximum Business value. Builds and maintains a repository of data architecture artefacts (e.g., a data dictionary). What We're Looking For: Proven track record in defining enterprise data architectures, data models, and database/data warehouse solutions. Evidenced ability to advise on the use of key data platform architectural components (e.g., Azure Lakehouse, Databricks, etc.) to deliver and optimise the enterprise data architecture. Experience in data integration technologies, real-time data ingestion, and API-based integrations. Experience in SQL and other database management systems. Strong problem-solving skills for interpreting complex data requirements and translating them into feasible data architecture solutions and models. Experience in supporting the definition of an enterprise data vision and strategy, advising on implications and/or uplifts required to the enterprise data architecture. Experience designing and establishing data governance models and data management practices, ensuring data is correct and secure whilst still being accessible, in line with regulations and wider organisational policies. Able to present complex data-related initiatives and issues to senior, non-data-conversant audiences. Proven experience working with AI and Machine Learning models preferred, but not essential. What We Can Offer You: We support your growth within the role, department, and across the company through internal opportunities. We offer a hybrid working model, allowing you to combine remote work with the opportunity to connect with your team in modern, welcoming office spaces. We encourage continuous learning with access to online platforms (e.g., LinkedIn Learning), language courses, soft skills training, and various wellbeing initiatives, including workshops and webinars. Join a diverse and inclusive work environment where your ideas are valued and your contributions make a difference.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Minimum of 5 years' experience as a software tester with proven experience in defining and leading QA cycles. Strong experience with DBT (Data Build Tool) and writing/validating SQL models. Hands-on experience with Collibra for metadata management and data governance validation. Solid understanding of data warehousing concepts and ETL/ELT processes. Proficiency in SQL for data validation and transformation testing. Familiarity with version control tools like Git. Understanding of data governance, metadata, and data quality principles. Strong analytical and problem-solving skills with attention to detail.
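By way of illustration, the kind of row-level assertion this role automates can be sketched in a few lines of Python; the `orders` table and its columns are hypothetical stand-ins, and in practice these checks would run as dbt generic tests (`unique`, `not_null`) against the warehouse rather than against sqlite3.

```python
import sqlite3

# Stand-in warehouse with a hypothetical `orders` model; in a dbt project
# the same assertions are declared in schema.yml and run via `dbt test`.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 99.5), (2, 11, 15.0), (2, 11, 15.0), (3, NULL, 20.0);
""")

# Equivalent of dbt's `unique` test: no order_id may appear more than once.
dupes = conn.execute(
    "SELECT order_id, COUNT(*) AS n FROM orders GROUP BY order_id HAVING n > 1"
).fetchall()

# Equivalent of dbt's `not_null` test: every order must carry a customer_id.
nulls = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL"
).fetchone()[0]

print(f"unique violations: {dupes}")      # [(2, 2)]
print(f"null customer_id rows: {nulls}")  # 1
```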
Posted 1 week ago
3.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: OneStream Extensive Finance SmartCPM. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: Finance background (MBA/PG/CA/CFA in Finance) recommended; Bachelor of Engineering; MS Azure Certification preferred. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and providing guidance to team members to foster a productive work environment. You will also engage in problem-solving discussions, ensuring that solutions are effectively implemented and meet the needs of stakeholders. Key Responsibilities: 1. Lead the design, development, and enhancement of OneStream solutions to support financial consolidation, planning, and reporting. 2. Collaborate with Finance, Accounting, and IT teams to gather business requirements and translate them into technical solutions within OneStream. 3. Manage and maintain metadata, business rules, data integrations, and reporting structures in OneStream. 4. Develop and maintain calculation scripts, business rules, and custom solutions using VB.NET or related scripting languages. 5. Drive the monthly and quarterly close processes by ensuring timely and accurate data loads, validations, and reporting outputs. 6. Develop and maintain dashboards, reports, and cube views for end-users across the organization. 7. Provide end-user support and training, acting as a subject matter expert (SME) for OneStream across the company. 8. Conduct system testing and troubleshooting, working with stakeholders and vendors as needed. 9. Work on break-fixes and enhancement requests. 10. Deliver assigned work successfully and on time with high quality. 11. Develop documentation for delivered solutions. 12. The candidate must have good troubleshooting skills and be able to think through issues and problems in a logical manner. Technical Experience: 1. 3+ years of development experience in OneStream, focused on but not limited to Financial Forecasting, Supply Chain Planning, and HR/Sales/Incentive Compensation Management or similar use cases. 2. 6+ years of strong background and experience in consulting roles focused on Financial Planning / Supply Chain / Sales Performance Planning. 3. Familiarity with SCRUM/Agile. 4. Hands-on in MS Excel, using advanced formulae to develop mock-ups for clients. 5. Ability to effectively communicate with the client team and in client-facing roles. 6. Ability to work effectively remotely and, if required, willingness to travel out of the base location. Professional Attributes: 1. Good communication skills, as the candidate will be speaking (in calls) and writing mails directly to the client. 2. The candidate should have leadership qualities and a positive attitude to take on challenging tasks. 3. Ability to manage a team of 10 people in development and support projects. 4. The candidate should have good analytical and presentation skills. 5. Strong sense of responsibility and positive attitude. Educational Qualification: Finance background (MBA/PG/CA/CFA in Finance) recommended; Bachelor of Engineering; MS Azure Certification preferred.
Posted 1 week ago
14.0 - 19.0 years
20 - 25 Lacs
Kochi
Work from Office
Job Summary: We are seeking a highly experienced and visionary Databricks Data Architect with over 14 years in data engineering and architecture, including deep hands-on experience in designing and scaling Lakehouse architectures using Databricks. The ideal candidate will possess deep expertise across data modeling, data governance, real-time and batch processing, and cloud-native analytics using the Databricks platform. You will lead the strategy, design, and implementation of modern data architecture to drive enterprise-wide data initiatives and maximize the value from the Databricks platform. Key Responsibilities: Lead the architecture, design, and implementation of scalable and secure Lakehouse solutions using Databricks and Delta Lake. Define and implement data modeling best practices, including medallion architecture (bronze/silver/gold layers). Champion data quality and governance frameworks leveraging Databricks Unity Catalog for metadata, lineage, access control, and auditing. Architect real-time and batch data ingestion pipelines using Apache Spark Structured Streaming, Auto Loader, and Delta Live Tables (DLT). Develop reusable templates, workflows, and libraries for data ingestion, transformation, and consumption across various domains. Collaborate with enterprise data governance and security teams to ensure compliance with regulatory and organizational data standards. Promote self-service analytics and data democratization by enabling business users through Databricks SQL and Power BI/Tableau integrations. Partner with Data Scientists and ML Engineers to enable ML workflows using MLflow, Feature Store, and Databricks Model Serving. Provide architectural leadership for enterprise data platforms, including performance optimization, cost governance, and CI/CD automation in Databricks. Define and drive the adoption of DevOps/MLOps best practices on Databricks using Databricks Repos, Git, Jobs, and Terraform. Mentor and lead engineering teams on modern data platform practices, Spark performance tuning, and efficient Delta Lake optimizations (Z-ordering, OPTIMIZE, VACUUM, etc.). Technical Skills: 10+ years in Data Warehousing, Data Architecture, and Enterprise ETL design. 5+ years hands-on experience with Databricks on Azure/AWS/GCP, including advanced Apache Spark and Delta Lake. Strong command of SQL, PySpark, and Spark SQL for large-scale data transformation. Proficiency with Databricks Unity Catalog, Delta Live Tables, Auto Loader, DBFS, Jobs, and Workflows. Hands-on experience with Databricks SQL and integration with BI tools (Power BI, Tableau, etc.). Experience implementing CI/CD on Databricks, using tools like Git, Azure DevOps, Terraform, and Databricks Repos. Proficient with streaming architecture using Spark Structured Streaming, Kafka, or Event Hubs/Kinesis. Understanding of ML lifecycle management with MLflow, and experience in deploying MLOps solutions on Databricks. Familiarity with cloud object stores (e.g., AWS S3, Azure Data Lake Gen2) and data lake architectures. Exposure to data cataloging and metadata management using Unity Catalog or third-party tools. Knowledge of orchestration tools like Airflow, Databricks Workflows, or Azure Data Factory. Experience with Docker/Kubernetes for containerization (optional, for cross-platform knowledge).
Preferred Certifications (a plus): Databricks Certified Data Engineer Associate/Professional; Databricks Certified Lakehouse Architect; Microsoft Certified: Azure Data Engineer / Azure Solutions Architect; AWS Certified Data Analytics - Specialty; Google Professional Data Engineer.
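A minimal sketch of the medallion-style ingestion this listing describes, assuming a Databricks runtime (the `cloudFiles` Auto Loader source is Databricks-specific); paths, the `event_id` column, and the target table name are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: incremental file ingestion with Auto Loader; schemaLocation lets
# Auto Loader infer and track the evolving source schema.
bronze = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/lake/_schemas/events")
    .load("/mnt/landing/events")
)

# Silver: basic cleansing and deduplication (assumes an event_id key exists).
silver = (
    bronze.withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["event_id"])
)

# availableNow processes the current backlog and stops, batch-style.
(silver.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/lake/_checkpoints/events_silver")
    .trigger(availableNow=True)
    .toTable("lake.silver.events"))
```

Gold-layer aggregates and Unity Catalog grants would sit on top of this; the sketch only covers bronze-to-silver.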
Posted 1 week ago
12.0 - 15.0 years
35 - 40 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Databricks Architect | 12-15 yrs | Pune/Bangalore/Hyderabad. What's this role about? As Databricks Architect, you will be responsible for providing advisory and thought leadership on the provision of analytics environments leveraging Cloud based platforms and big data technologies, including integration with existing data and analytics platforms and tools. You will contribute to pre-sales and design, implement scalable data architectures and information solutions covering data security, data privacy, data governance, metadata management, multi-tenancy and mixed workload management, and provide delivery oversight. Here's how you'll contribute: The Databricks Architect at Zensar participates in the end-to-end cycle from opportunity identification to closure, takes complete ownership of project execution, and provides valuable expertise in the project. You will do this by: Responding to client RFI/RFP documents with proper solution design, including cost estimates. Understanding customer requirements and creating technical propositions. Contributing to SoWs, technical project roadmaps, etc., required for successful execution of projects, leveraging a Technical Scoping & Solutioning approach. Managing and owning all aspects of technical development and delivery. Understanding requirements and writing technical documents. Ensuring code review and developing best practices. Planning the end-to-end technical scope of the project and customer engagement areas, including planning sprints and delivery. Estimating effort, identifying risk and providing technical support whenever needed. Demonstrating the ability to multitask and re-prioritize responsibilities based on dynamic requirements. Leading and mentoring teams as needed. Skills required to contribute: 12-15 years of Data and Analytics experience with minimum 6+ years in Big Data technologies and minimum 3 years in Azure/AWS/GCP Data Cloud native services, including experience in Databricks. Excellent communication and presentation skills. Hands-on experience with the Big Data stack (HDFS, Spark, MapReduce, Hadoop, Sqoop, Pig, Hive, HBase, Flume, Kafka) as well as with NoSQL stores (e.g., MongoDB, HBase, Cassandra). Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler. Experience working on cloud platforms (AWS/Azure/Google Cloud) to analyze, re-architect and re-platform on-premise data warehouses to cloud using native or 3rd-party services. Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, Scala, and a good understanding of Delta Lake. Good experience in designing and delivering data analytics solutions using Azure, AWS or GCP Cloud native services. Good experience in Requirements Analysis and Solution Architecture Design, data modelling, ETL, data integration and data migration design. Well versed with Waterfall, Agile, Scrum and similar project delivery methodologies. Experienced in internal as well as external stakeholder management. Experience in MDM/DQM/Data Governance technologies like Collibra, Ataccama, Alation, or Reltio will be an added advantage. Databricks/Spark or Cloud Data certification will be an added advantage. Nice to have skills: Working experience with Snowflake, reporting tools like Power BI/Tableau, Unix, etc. How we'd like you to lead: Take part in and host regular knowledge sharing sessions, mentor more junior members of the team, and support the continuous development of our practice. We also want you to: Share project stories with the wider business.
Provide recommendations and best practices for application development, platform development, and developer tools. Actively stay abreast of industry best practices, share learnings, and experiment with and apply cutting-edge technologies. Provide technical leadership and be a role model/coach to software engineers pursuing a technical career path in engineering. Provide/inspire innovations that fuel the growth of Zensar as a whole and generate creative ideas for emerging business needs. Create content to promote your personal brand and Zensar to the world. Live Zensar's values, both internally and externally.
Posted 1 week ago
3.0 - 6.0 years
10 - 14 Lacs
Gurugram
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include: Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation. Stakeholder Collaboration and Issue Resolution: Collaborate with key stakeholders, internal and external, to understand the problems and issues with the product and features, and solve the issues as per defined SLAs. Continuous Learning and Technology Integration: Being eager to learn new technologies and implement the same in feature development. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: SQL authoring, query, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git, and preferably knowledge of Infrastructure as Code: Terraform. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions. Preferred technical and professional experience: Experience building and optimising data pipelines, architectures and data sets. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
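As a hedged example of the BigQuery cost-optimisation work mentioned above, the client library's dry-run mode estimates scanned bytes before a query spends money; the project, dataset, and table names are placeholders:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses application-default credentials

# Parameterized query: safer than string interpolation, and dry_run reports
# the bytes the query would scan without actually executing it.
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01")],
    dry_run=True,
)
sql = "SELECT COUNT(*) AS n FROM `my_project.analytics.events` WHERE event_date = @day"
job = client.query(sql, job_config=job_config)
print(f"Estimated bytes scanned: {job.total_bytes_processed}")
```

Partitioning the table on `event_date` is what makes the scanned-bytes figure small here; the dry run is how you verify that the partition filter is actually being applied.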
Posted 1 week ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Locations: Pune/Bangalore/Hyderabad/Indore. Contract duration: 6 months. Responsibilities: Be responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms. Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional - must; NoSQL - optional) and data tools (reporting, visualization, analytics, and machine learning). Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models. Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization. Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC. Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks. Must have a payments background. Skills: Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols). Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required. Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required. Experience in team management, communication, and presentation. Experience with Erwin, Visio or any other relevant tool.
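For illustration, translating a logical dimensional model into physical tables can be sketched with SQLAlchemy; the payments-flavoured star schema below is hypothetical, chosen only to match the listing's payments background requirement:

```python
from sqlalchemy import (Column, Date, ForeignKey, Integer, Numeric,
                        String, create_engine)
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class DimMerchant(Base):
    """Dimension table: descriptive merchant attributes."""
    __tablename__ = "dim_merchant"
    merchant_key = Column(Integer, primary_key=True)
    merchant_name = Column(String(100))
    category = Column(String(50))

class FactPayment(Base):
    """Fact table: one row per payment event, keyed to its dimensions."""
    __tablename__ = "fact_payment"
    payment_key = Column(Integer, primary_key=True)
    merchant_key = Column(Integer, ForeignKey("dim_merchant.merchant_key"))
    payment_date = Column(Date)
    amount = Column(Numeric(18, 2))

# Generate the physical schema from the logical model; SQLite stands in
# for whatever RDBMS the target platform uses.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
```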
Posted 1 week ago
3.0 - 6.0 years
5 - 8 Lacs
Hyderabad
Work from Office
SAP professionals design, implement and deploy SAP solutions to achieve defined business goals. Maintain skills in SAP application process design and configuration; SAP application design, development, integration, testing and deployment. Responsibilities: Responsible for setting up the SAP DMS structure for a large-scale enterprise. Integration and installation of the SAP DMS application with other core SAP modules. Keeping the scope in mind, configure the system accordingly. Able to test and document the system. Train key users. Experienced with SAP Documents, Document Structure, Document Search, Document Distribution, Version Management, Document Vaulting, Integrated Viewer, Status Management, Document Conversion, Document Workflow, and Interfacing Table specification. Should also be versed with the integration aspects of SAP DMS applications with other core modules of SAP ECC. Experience with OpenText or Documentum will be an advantage. Setting up SAP Knowledge Management and SAP Content Server; experienced with metadata search & configuration and integration with workflow configuration, TREX integration, and security associated with DMS implementation. Should have worked with associated standard workflows across MM, QM and PM. Experienced in the setup of SAP ArchiveLink to support retention.
Posted 1 week ago
1.0 - 8.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Career Category: Information Systems. Job Description: ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what's known today. ABOUT THE ROLE Role Description: We are seeking an MDM Associate Data Engineer with 2-5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills, and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; hence, candidates having only MDM experience are not eligible for this role. The candidate must have data engineering experience with technologies like SQL, Python, PySpark, Databricks, and AWS, along with knowledge of MDM (Master Data Management). Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance. Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or related field OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or related field OR Diploma with 6-8 years of experience in Business, Engineering, IT or related field. Functional Skills: Must-Have Skills: Advanced SQL expertise and data wrangling. Strong experience in Python and PySpark for data transformation workflows. Strong experience with Databricks and AWS architecture. Must have knowledge of MDM, data governance, stewardship, and profiling practices. In addition to the above, candidates having experience with Informatica or Reltio MDM platforms will be preferred. Good-to-Have Skills: Experience with IDQ, data modeling and approval workflow/DCR. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Strong grip on data engineering concepts. Professional Certifications: Any ETL certification (e.g., Informatica). Any Data Analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure). Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
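A small, hypothetical sketch of the survivorship logic behind the golden-record work this role involves, using PySpark window functions; the source-system precedence rule and column names are illustrative, not Amgen's actual MDM configuration:

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Illustrative customer records from two source systems.
df = spark.createDataFrame(
    [("C1", "CRM", "2024-05-01", "Dr. A. Rao"),
     ("C1", "ERP", "2024-06-10", "Anil Rao"),
     ("C2", "CRM", "2024-04-20", "B. Shah")],
    ["master_id", "source", "updated_at", "name"],
)

# Survivorship: prefer the CRM record, then the most recent update.
precedence = F.when(F.col("source") == "CRM", 1).otherwise(2)
w = Window.partitionBy("master_id").orderBy(precedence, F.col("updated_at").desc())

golden = (df.withColumn("rnk", F.row_number().over(w))
            .filter("rnk = 1")
            .drop("rnk"))
golden.show()  # one surviving row per master_id
```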
Posted 1 week ago
5.0 - 10.0 years
4 - 7 Lacs
Hyderabad
Work from Office
A Data Governance Specialist will be responsible for several technical tasks, including developing and maintaining Collibra workflows and creating and maintaining integrations between data systems, data catalogs, and data quality tools. Additionally, other tasks related to managing metadata, master data, and the business glossary will be required. Key Responsibilities: Develop and maintain Collibra workflows to support data governance initiatives. Create and maintain integrations between data systems and data governance tools. Write and maintain data quality rules to measure data quality. Work with vendors to troubleshoot and resolve technical issues related to workflows and integrations. Work with other teams to ensure adherence to DG policies and standards. Assist in implementing data governance initiatives around data quality, master data, and metadata management. Qualifications: Strong programming skills. Knowledge of system integration and use of middleware solutions. Proficiency in SQL and relational databases. Understanding of data governance, including data quality, master data, and metadata management. Willingness to learn new tools and skills. Preferred Qualifications: Proficient with Java or Groovy. Proficient with Mulesoft or other middleware. Proficient with Collibra DIP, Collibra Data Quality, and DQLabs. Experience with AWS Redshift and Databricks.
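To illustrate the data-quality-rules portion of the role, a minimal rule registry can be expressed as named SQL predicates; the `customer` table and rule names are hypothetical, and a Collibra Data Quality or DQLabs deployment would store and schedule such rules rather than a Python dict:

```python
import sqlite3

# Each rule is a SQL query returning the count of failing rows (0 = pass).
RULES = {
    "customer_email_not_null": "SELECT COUNT(*) FROM customer WHERE email IS NULL",
    "customer_id_unique": (
        "SELECT COUNT(*) FROM (SELECT id FROM customer "
        "GROUP BY id HAVING COUNT(*) > 1)"
    ),
}

def run_rules(conn: sqlite3.Connection) -> dict:
    """Return rule name -> failing-row count, ready to publish to a catalog."""
    return {name: conn.execute(sql).fetchone()[0] for name, sql in RULES.items()}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customer VALUES (?, ?)",
                 [(1, "a@x.com"), (2, None), (2, "b@x.com")])
print(run_rules(conn))  # {'customer_email_not_null': 1, 'customer_id_unique': 1}
```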
Posted 1 week ago
4.0 - 11.0 years
12 - 13 Lacs
Bengaluru
Work from Office
Job Description: QA DBT Engineer. Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai. Minimum of 8 years' experience as a software tester with proven experience in defining and leading QA cycles. Strong experience with DBT (Data Build Tool) and writing/validating SQL models. Hands-on experience with Collibra for metadata management and data governance validation. Solid understanding of data warehousing concepts and ETL/ELT processes. Proficiency in SQL for data validation and transformation testing. Familiarity with version control tools like Git. Understanding of data governance, metadata, and data quality principles. Strong analytical and problem-solving skills with attention to detail. Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services, such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
Posted 1 week ago
5.0 - 15.0 years
11 - 13 Lacs
Bengaluru
Work from Office
Job Description: Collibra Data Governance Specialist. Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai. Required Skills: 5-15 years of experience in data governance and/or metadata management. Hands-on experience with Collibra Data Governance Center (Collibra DGC), including workflow configuration, cataloging, and operating model customization. Strong knowledge of metadata management, data lineage, and data quality principles. Hands-on experience with Snowflake. Familiarity with data integration tools and the AWS cloud platform. Experience with SQL and working knowledge of relational databases. Understanding of data privacy regulations (e.g., GDPR, CCPA) and compliance frameworks. Preferred Skills: Certifications such as Collibra Certified Solution Architect. Experience integrating Collibra with tools like Snowflake, Tableau or other BI/analytics platforms. Exposure to DataOps, MDM (Master Data Management), and data governance frameworks like DAMA-DMBOK. Strong communication and stakeholder management skills. Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services, such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
Posted 1 week ago
2.0 - 6.0 years
4 - 7 Lacs
Kolkata
Work from Office
Data Modeling & Design: Design and maintain conceptual, logical, and physical data models to support transactional, analytical, and data warehousing systems. Develop data models to ensure data consistency, integrity, and quality across multiple banking functions and applications. Define and implement best practices for data modeling, data integration, and metadata management. Data Architecture Collaboration: Collaborate with data architects to align models with enterprise data architecture and ensure optimal performance and scalability. Work with database administrators to translate logical models into physical database structures and optimize them for performance. Data Quality & Governance: Establish data standards, definitions, and quality rules to ensure data accuracy, consistency, and compliance. Create and maintain data dictionaries and metadata repositories to support data governance and facilitate efficient data access. Stakeholder Engagement: Engage with business stakeholders, data scientists, and IT teams to understand data requirements and translate business needs into robust data models. Ensure data models support key business initiatives, such as regulatory reporting, analytics, and operational efficiency. Documentation & Best Practices: Develop and maintain detailed documentation, including data models, entity-relationship diagrams, and mapping specifications. Implement data modeling standards and mentor team members to promote best practices in data management. Requirements: Educational Qualification: MBA / Engineering degree with relevant industry experience. Experience: Minimum of 10 years of experience in data modeling or as a data modeler within the banking industry. Proven expertise in designing data models across at least three of the following areas: Data Warehousing, Analytics, BI, Data Mining, Data Quality, Metadata Management. Technical Skills: Proficiency in data modeling tools such as Erwin, IBM InfoSphere, or SAP PowerDesigner. Strong SQL skills and experience in relational database systems like Oracle, SQL Server, or DB2. Familiarity with big data technologies and NoSQL databases is a plus. Knowledge of ETL processes and tools (e.g., Informatica, Talend) and experience working with BI tools (e.g., Tableau, Power BI). Knowledge of SAS for analytics, data manipulation, and data management is a plus. Strong understanding of data governance frameworks, data quality management, and regulatory compliance. Soft Skills: Strong analytical and problem-solving skills, with attention to detail and accuracy. Excellent communication and interpersonal skills, with the ability to translate technical data concepts for business stakeholders. Proven ability to work collaboratively in a cross-functional environment and manage multiple projects.
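One concrete slice of the data-dictionary work mentioned above: SQLAlchemy's inspector can generate a first-pass dictionary straight from a physical schema. The `account` table here is an illustrative banking stand-in:

```python
from sqlalchemy import create_engine, inspect, text

# SQLite stands in; the same inspection works against Oracle, SQL Server, or DB2
# via the appropriate SQLAlchemy dialect.
engine = create_engine("sqlite:///:memory:")
with engine.begin() as conn:
    conn.execute(text(
        "CREATE TABLE account (account_id INTEGER PRIMARY KEY, "
        "branch_code TEXT NOT NULL, opened_on DATE)"
    ))

# Emit a data-dictionary line per column: name, type, nullability.
inspector = inspect(engine)
for table in inspector.get_table_names():
    for col in inspector.get_columns(table):
        print(f"{table}.{col['name']}: {col['type']} (nullable={col['nullable']})")
```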
Posted 1 week ago
10.0 - 15.0 years
5 - 9 Lacs
Mumbai
Work from Office
Role Overview: We are hiring a Talend Data Quality Developer to design and implement robust data quality (DQ) frameworks in a Cloudera-based data lakehouse environment. The role focuses on building rule-driven validation and monitoring processes for migrated data pipelines, ensuring high levels of data trust and regulatory compliance across critical banking domains. Key Responsibilities: Design and implement data quality rules using Talend DQ Studio, tailored to validate customer, account, transaction, and KYC datasets within the Cloudera Lakehouse. Create reusable templates for profiling, validation, standardization, and exception handling. Integrate DQ checks within PySpark-based ingestion and transformation pipelines targeting Apache Iceberg tables. Ensure compatibility with Cloudera components (HDFS, Hive, Iceberg, Ranger, Atlas) and job orchestration frameworks (Airflow/Oozie). Perform initial and ongoing data profiling on source and target systems to detect data anomalies and drive rule definitions. Monitor and report DQ metrics through dashboards and exception reports. Work closely with data governance, architecture, and business teams to align DQ rules with enterprise definitions and regulatory requirements. Support lineage and metadata integration with tools like Apache Atlas or external catalogs. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience: 5-10 years in data management, with 3+ years in Talend Data Quality tools. Platforms: Experience in Cloudera Data Platform (CDP), with understanding of the Iceberg, Hive, HDFS, and Spark ecosystems. Languages/Tools: Talend Studio (DQ module), SQL, Python (preferred), Bash scripting. Data Concepts: Strong grasp of data quality dimensions: completeness, consistency, accuracy, timeliness, uniqueness. Banking Exposure: Experience with financial services data (CIF, AML, KYC, product masters) is highly preferred.
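A hedged sketch of the validate-and-quarantine pattern these responsibilities describe, in PySpark; the KYC-style columns are invented for the example, and in the target platform the input would come from an Iceberg table and quarantined rows would land in an exception table:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Illustrative KYC-style records; in practice, read from an Iceberg table.
df = spark.createDataFrame(
    [("K1", "PAN123", "2024-01-05"), ("K2", None, "2024-02-10")],
    ["customer_id", "pan_number", "kyc_date"],
)

# Completeness rule: pan_number must be populated.
failed = df.filter(F.col("pan_number").isNull())
passed = df.subtract(failed)

# Exceptions would be written to a quarantine table for stewardship review;
# here we just report the split.
print(f"passed={passed.count()}, quarantined={failed.count()}")
```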
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad
Work from Office
Location: Hyderabad (Gachibowli). Experience: 3-5 Years. Industry: Healthcare (Preferred). Overview: We are seeking a results-oriented Digital Marketing Analyst/Executive with a strong background in SEO, especially Local SEO, and a passion for driving measurable results. If you have experience in healthcare and have successfully scaled organic traffic across multiple locations, this is your opportunity to shine. Your mission will be to take our web traffic from 25K to 150K+ in just 6-9 months. Roles & Responsibilities: Local SEO Mastery: Manage and optimize Google Business Profiles (GMB) for multiple locations. Implement geo-targeted strategies to rank for "near me" and location-specific search terms. Ensure NAP (Name, Address, Phone) consistency across directories. Build and optimize location-specific landing pages and blog content. Organic Traffic Growth & SEO Strategy: Design and execute end-to-end SEO strategies for aggressive traffic growth. Conduct keyword research, competitor analysis, and SERP audits. Optimize on-page elements: metadata, headings, schema, and content structures. Develop content clusters and pillar pages to build domain authority. Link Building Execution: Lead white-hat link-building campaigns: guest blogging, PR outreach, citations. Build links from niche healthcare directories, influencers, and forums. Monitor and maintain a healthy backlink profile. Analytics & Reporting: Track performance using Google Analytics 4, Search Console, and Looker Studio. Set and measure KPIs like traffic growth, lead generation, and keyword rankings. Conduct regular SEO audits and performance reviews. Multi-Location SEO Management: Coordinate SEO efforts across 25+ physical locations. Resolve common SEO issues such as duplicate content, cannibalization, and internal linking. Ensure that each location page is SEO-optimized and user-friendly. Collaboration & Strategy: Work closely with content creators, designers, developers, and paid media teams. Provide SEO input for UX/CRO improvements. Stay current on SEO trends, Google algorithm updates, and AI-driven SEO tools. Candidate Requirements: 3-5 years of proven experience in digital marketing with a focus on SEO. Experience working in or with the healthcare sector. Demonstrated ability to grow organic traffic 3x-5x within a short time frame. Proficiency in local SEO management for 20+ locations. Expertise with tools like Ahrefs, SEMrush, Screaming Frog, GMB, and GA4. Strong communication, analytical, and project management skills. Note: The selected candidate will be required to work on-site at the client's location, 6 days a week.
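As an illustration of the on-page audit work listed above, a short Python script can pull the title, meta description, and H1 count that an SEO review checks first; the length thresholds are common guidelines, not hard rules:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def audit_page(url: str) -> dict:
    """Collect the on-page elements a basic SEO audit checks first."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag["content"].strip() if desc_tag and desc_tag.get("content") else ""
    return {
        "title": title,
        "title_ok": 10 <= len(title) <= 60,            # common length guideline
        "description": description,
        "description_ok": 50 <= len(description) <= 160,
        "h1_count": len(soup.find_all("h1")),          # ideally exactly one
    }

print(audit_page("https://example.com"))
```

Run across the 25+ location pages, the same script becomes a quick way to catch missing descriptions and duplicate titles at scale.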
Posted 1 week ago
3.0 - 6.0 years
10 - 14 Lacs
Bengaluru
Work from Office
As an entry level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for: Working across the entire system architecture to design, develop, and support high quality, scalable products and interfaces for our clients. Collaborating with cross-functional teams to understand requirements and define technical specifications for generative AI projects. Employing IBM's Design Thinking to create products that provide a great user experience along with high performance, security, quality, and stability. Working with a variety of relational databases (SQL, Postgres, DB2, MongoDB), operating systems (Linux, Windows, iOS, Android), and modern UI frameworks (Backbone.js, AngularJS, React, Ember.js, Bootstrap, and jQuery). Creating everything from mockups and UI components to algorithms and data structures as you deliver a viable product. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: SQL authoring, query, and cost optimisation, primarily on BigQuery. Python as an object-oriented scripting language. Data pipeline, data streaming and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming. Version control system: Git, and preferably knowledge of Infrastructure as Code: Terraform. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases. Experience performing root cause analysis on internal and external data and processes to answer specific business questions. Preferred technical and professional experience: Experience building and optimising data pipelines, architectures and data sets. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Working knowledge of message queuing, stream processing, and highly scalable data stores. Experience supporting and working with cross-functional teams in a dynamic environment. We are looking for a candidate with experience in a Data Engineer role who is also familiar with Google Cloud Platform.
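A minimal streaming sketch of the Dataflow/Pub-Sub stack this listing names, written with the Apache Beam Python SDK; the topic path is a placeholder, and running it for real needs the DataflowRunner (or the streaming DirectRunner) plus GCP credentials:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

# Streaming mode is required for the Pub/Sub source.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (p
     # Read raw bytes from a (placeholder) Pub/Sub topic.
     | "Read" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
     | "Decode" >> beam.Map(lambda b: b.decode("utf-8"))
     # Assume CSV lines whose first field is a user id.
     | "KeyByUser" >> beam.Map(lambda line: (line.split(",")[0], 1))
     # Count events per user in fixed one-minute windows.
     | "Window1m" >> beam.WindowInto(window.FixedWindows(60))
     | "CountPerUser" >> beam.CombinePerKey(sum)
     | "Print" >> beam.Map(print))
```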
Posted 1 week ago
10.0 - 12.0 years
8 - 13 Lacs
Mumbai
Work from Office
John Cockerill, enablers of opportunities. Driven since 1817 by the entrepreneurial spirit and thirst for innovation of its founder, the John Cockerill Group develops large-scale technological solutions to meet the needs of its time: facilitating access to low carbon energies, enabling sustainable industrial production, preserving natural resources, contributing to greener mobility, enhancing security and installing essential infrastructures. Its offer to businesses, governments and communities consists of services and associated equipment for the sectors of energy, defence, industry, the environment, transports, and infrastructures. With over 6,000 employees, John Cockerill achieved a turnover of €1.209 billion in 2023 in 29 countries, on 5 continents. Location: Navi Mumbai. The John Cockerill Group develops large-scale technological solutions to meet the needs of our time: preserving natural resources, contributing to greener mobility, producing sustainably, combating insecurity, and facilitating access to renewable energy. Its offer to companies, governments and local authorities takes the form of services and associated equipment for the energy, defence, industry, environment, transport, and infrastructure sectors. Driven since 1817 by the entrepreneurial spirit and thirst for innovation of its founder, the group employs 6,000 people who have enabled it to achieve revenues of more than one billion euros in 2020 in 23 countries on 5 continents. Context: The John Cockerill group manufactures and supplies water electrolysis equipment for hydrogen production. John Cockerill develops integrated solutions including power supply, water treatment, compressor and H2 storage, and hydrogen refueling stations. To support the development of the Hydrogen Business Line, John Cockerill is looking for an Analyst and SAP Data Manager. The role may evolve according to business needs. The Analyst and SAP Data Manager will report to the Supply Chain Process Manager. Responsibilities: The Analyst and SAP Data Manager will be responsible for SAP Material Request Management: Validate material master data creation requests. Follow up pending requests and edit a weekly status. Be the point of contact for Supply Chain SAP master data related topics. Escalate accordingly when agreement cannot be reached. The Analyst and SAP Data Manager will be responsible for creation and maintenance: Create Supply Chain master data in SAP, and maintain and complete it in accordance with requirements. Follow up the delivery of master data, both in terms of deadlines and quality. Cleanse and correct incorrect data. The Analyst and SAP Data Manager will also be responsible for the maintenance and optimization of the team's SharePoint environment. This includes: Regularly cleaning and organizing folders and files to ensure accessibility. Adding and maintaining metadata to improve document classification and searchability. Ensuring compliance with internal data governance policies. Collaborating with team members to ensure SharePoint content remains up-to-date and relevant. The Analyst and SAP Data Manager will work closely with the Global Master Data Officer, Configuration Management, the Engineering Department, Industrialization Engineers, Quality, Procurement, Project Management and the IT Department. Profile: Min 10 to 12 years' experience in setting up and maintaining master data in the industrial sector. Experience with SAP MM is an asset. Good technical competencies.
Experience with SharePoint administration and metadata management. Accurate in keeping track of open actions and realizations. Resilient and solution-oriented (working on developing projects in an environment and business that is expected to grow exponentially in the coming years). Solution and result oriented. Reliable. Team player. Do you want to work for an innovative company that will allow you to take up technical challenges on a daily basis? Discover our job opportunities in detail on www.johncockerill.com
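For a flavour of the cleanse-and-correct work described above, a pandas sketch over an invented material-master extract; the column names and rules are placeholders for whatever the SAP MM extract actually contains:

```python
import pandas as pd

# Illustrative material-master extract; real extracts come from SAP MM.
raw = pd.DataFrame({
    "material": [" M-1001 ", "M-1002", "m-1001"],
    "base_unit": ["ea", "KG", "EA"],
    "description": ["Valve, 2in", "Steel plate", "Valve, 2in"],
})

# Standardize: trim whitespace and normalize case on key fields.
clean = raw.assign(
    material=raw["material"].str.strip().str.upper(),
    base_unit=raw["base_unit"].str.upper(),
)

# Flag duplicate material numbers for a data-change request rather than
# silently dropping them.
clean["duplicate"] = clean.duplicated("material", keep=False)
print(clean)
```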
Posted 1 week ago
3.0 - 8.0 years
45 - 50 Lacs
Hyderabad
Work from Office
Payroll Technology at Amazon is all about enabling our business to perform at scale as efficiently as possible with no defects. As Amazon's workforce grows, both in size and geography, Amazon's payroll operations become increasingly complex, and our customers are asked to do more with less. Process can only get them so far, and that's where we come in with technology solutions to integrate and automate systems, detect defects before payment, and provide insights. As a data engineer in payroll, you will have to onboard payroll vendors across various geographies by building versatile and scalable design solutions. Having strong written and verbal communication, and the ability to communicate with end users in non-technical terms, is vital to your long-term success. The ideal candidate will have experience working with large datasets, distributed computing technologies and service-oriented architecture. The candidate should relish working with large volumes of data and enjoy the challenge of highly complex technical contexts. He/she should be an expert in data modeling, ETL design and business intelligence tools, and have hands-on knowledge of columnar databases. He/she is a self-starter, comfortable with ambiguity, able to think big and enjoys working in a fast-paced team. Responsibilities: Design, build and own all the components of a high-volume data warehouse end to end. Build efficient data models using industry best practices and metadata for ad-hoc and pre-built reporting. Provide wing-to-wing data engineering support for project lifecycle execution (design, execution and risk assessment). Interface with business customers, gathering requirements and delivering complete data & reporting solutions, owning the design, development, and maintenance of ongoing metrics, reports, dashboards, etc., to drive key business decisions. Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources. Own the functional and nonfunctional scaling of software systems in your ownership area. Implement big data solutions for distributed computing. Willing to learn and develop a strong skill set in AWS technologies. As a DE on our team, you will be responsible for leading the data modelling, database design, and launch of some of the core data pipelines. You will have significant influence on our overall strategy by helping define the data model, drive the database design, and spearhead best practices to deliver high quality products. A day in the life: You are expected to do data modelling, database design, build data pipelines as per Amazon standards, design reviews, and support data privacy and security initiatives. You will attend regular stand-up meetings and provide your updates. You will keep an eye out for opportunities to improve the product or user experience and suggest those enhancements. You will participate in requirement grooming meetings to ensure the use cases we deliver are complete and functional. You will take your turn at on-call and own production operational maintenance. You will respond to customer issues and monitor databases for healthy state and performance.
About the team: Our mission is to build applications which can solve the challenges Global Payroll Operations teams face on a daily basis, automate the tasks they perform manually, provide them a seamless experience by integrating with other dependent systems, and eventually reduce pay defects and improve pay accuracy. 3+ years of data engineering experience. 4+ years of SQL experience. Experience with data modeling, warehousing and building ETL pipelines. Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions. Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases).
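A skeleton of the Glue-based ETL this role works with, assuming the standard AWS Glue PySpark job entry point; the catalog database, table name, and S3 path are placeholders:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve job arguments and initialize contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract from the Glue Data Catalog (placeholder database/table).
src = glue_context.create_dynamic_frame.from_catalog(
    database="payroll_raw", table_name="vendor_payments")

# Transform: drop a column we don't want downstream (illustrative).
cleaned = src.drop_fields(["temp_col"])

# Load curated parquet to S3 for Redshift Spectrum or COPY ingestion.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://my-bucket/curated/vendor_payments/"},
    format="parquet",
)
job.commit()
```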
Posted 1 week ago
3.0 - 6.0 years
10 - 14 Lacs
Hyderabad
Work from Office
As Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience in the integration efforts between Alation and Manta, ensuring seamless data flow and compatibility. Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively. Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities. Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns. Preferred technical and professional experience: Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives. Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation and Manta.
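Since the exact Alation and Manta APIs vary by version, here is a deliberately tool-agnostic sketch of the lineage-coverage analysis this role performs: given catalog assets and lineage edges (however they are exported), find the assets with no recorded lineage. All names and data are invented:

```python
# Hypothetical exports: asset names from a catalog (e.g. Alation) and
# source->target lineage edges from a lineage tool (e.g. Manta).
catalog_assets = {"sales.orders", "sales.customers", "finance.ledger"}
lineage_edges = [("raw.orders", "sales.orders"), ("sales.orders", "finance.ledger")]

# An asset is "covered" if it appears on either end of any lineage edge.
covered = {src for src, _ in lineage_edges} | {dst for _, dst in lineage_edges}
uncovered = sorted(catalog_assets - covered)

print(f"{len(uncovered)} assets lack lineage: {uncovered}")
# -> 1 assets lack lineage: ['sales.customers']
```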
Posted 1 week ago
3.0 - 8.0 years
7 - 11 Lacs
Mumbai
Work from Office
3+ years of hands-on experience with the Collibra tool. Knowledge of Collibra DGC version 5.7 and onward. Experience in Spring Boot development. Experience with Groovy and Flowable for BPMN workflow development. Experience with both business and technical metadata. Experience with platform activities like job server setup and upgrade. Working as SME in data governance, metadata management and data catalog solutions, specifically on Collibra Data Governance. Client interface and consulting skills required. Experience in data governance of a wide variety of data types (structured, semi-structured and unstructured data) and a wide variety of data sources (HDFS, S3, Kafka, Cassandra, Hive, HBase, Elasticsearch). Partner with Data Stewards for requirements, integrations and processes; participate in meetings and working sessions. Partner with Data Management and integration leads to improve data management technologies and processes. Working experience of the Collibra operating model, workflow BPMN development, and how to integrate various applications or systems with Collibra. Experience in setting up people's roles, responsibilities and controls, data ownership, workflows and common processes. Integrate Collibra with other enterprise tools: data quality tools, data catalog tools, Master Data Management solutions. Develop and configure all Collibra customized workflows. Develop APIs (REST, SOAP) to expose the metadata functionalities to end-users. Location: Pan India
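A hedged example of the REST integration work in the last item, using the general shape of the Collibra REST API v2; the endpoint path, query parameters, host, and credentials are assumptions to verify against your instance's API documentation:

```python
import requests

BASE = "https://your-instance.collibra.com"  # placeholder host
session = requests.Session()
session.auth = ("svc_user", "secret")  # placeholder Basic-auth credentials

# Search catalog assets by name; endpoint and parameters follow the
# Collibra REST API v2 shape but should be checked against your version.
resp = session.get(
    f"{BASE}/rest/2.0/assets",
    params={"name": "Customer", "nameMatchMode": "ANYWHERE", "limit": 10},
)
resp.raise_for_status()
for asset in resp.json().get("results", []):
    print(asset["id"], asset["name"])
```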
Posted 1 week ago
12.0 - 15.0 years
55 - 60 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Dear Candidate, We are looking for a Data Governance Specialist to establish and maintain data governance policies, ensuring data quality, compliance, and responsible usage across the organization. Key Responsibilities: Define and enforce data governance policies and standards. Work with data owners and stewards to improve data quality. Monitor compliance with data privacy regulations (GDPR, HIPAA, etc.). Support metadata management, data lineage, and cataloging initiatives. Promote data literacy across departments. Required Skills & Qualifications: Experience with data governance tools (Collibra, Alation, Informatica). Knowledge of regulatory frameworks and data privacy laws. Strong analytical and documentation skills. Understanding of data architecture, MDM, and data stewardship. Excellent stakeholder management and communication skills. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Srinivasa Reddy Kandi Delivery Manager Integra Technologies
Posted 1 week ago
3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Happiest Minds Technologies Pvt. Ltd. is looking for an Analyst to join our dynamic team and embark on a rewarding career journey.

Managing master data, including creation, updates, and deletion.
Managing users and user roles.
Providing quality assurance of imported data, working with quality assurance analysts if necessary.
Commissioning and decommissioning of data sets.
Processing confidential data and information according to guidelines.
Helping develop reports and analysis.
Managing and designing the reporting environment, including data sources, security, and metadata.
Supporting the data warehouse in identifying and revising reporting requirements.
Supporting initiatives for data integrity and normalization.
Assessing, testing, and implementing new or upgraded software, and assisting with strategic decisions on new systems.
Generating reports from single or multiple systems.
Troubleshooting the reporting database environment and reports.
Evaluating changes and updates to source production systems.
Training end users on new reports and dashboards.
Providing technical expertise in data storage structures, data mining, and data cleansing.
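The data cleansing and normalization duties above lend themselves to small, reusable transformation steps. A minimal sketch with pandas; the column names and rules are illustrative assumptions:

```python
# Minimal sketch: a reusable cleansing/normalisation step with pandas.
# Column names and normalisation rules are illustrative assumptions.
import pandas as pd

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Trim text fields, standardise casing, and drop exact duplicates."""
    out = df.copy()
    for col in out.select_dtypes(include="object"):
        out[col] = out[col].str.strip()
    if "country" in out.columns:
        out["country"] = out["country"].str.upper()  # normalise codes
    return out.drop_duplicates()

# Two raw rows that collapse to one after trimming and casing.
raw = pd.DataFrame({"name": [" Ada ", "Ada"], "country": ["in", "IN"]})
print(cleanse(raw))
```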
Posted 1 week ago
0.0 - 4.0 years
2 - 6 Lacs
Hyderabad
Work from Office
CHARTING NOW VISUAL DATA SOLUTIONS PRIVATE LIMITED is looking for an Analyst to join our dynamic team and embark on a rewarding career journey.

Managing master data, including creation, updates, and deletion.
Managing users and user roles.
Providing quality assurance of imported data, working with quality assurance analysts if necessary.
Commissioning and decommissioning of data sets.
Processing confidential data and information according to guidelines.
Helping develop reports and analysis.
Managing and designing the reporting environment, including data sources, security, and metadata.
Supporting the data warehouse in identifying and revising reporting requirements.
Supporting initiatives for data integrity and normalization.
Assessing, testing, and implementing new or upgraded software, and assisting with strategic decisions on new systems.
Generating reports from single or multiple systems.
Troubleshooting the reporting database environment and reports.
Evaluating changes and updates to source production systems.
Training end users on new reports and dashboards.
Providing technical expertise in data storage structures, data mining, and data cleansing.
Good proficiency in the MS Office suite, viz. Excel and PowerPoint.
Good command over written and verbal communication.
Willing to work in flexible shifts.
Posted 1 week ago
1.0 - 5.0 years
6 - 10 Lacs
Mumbai, Mumbai Suburban
Work from Office
An SEO Executive develops and implements strategies to improve a website's visibility and organic traffic on search engines, including keyword research, on-page optimization, and link building.

Key Responsibilities:

SEO Strategy Development and Execution:
Develop and implement effective SEO strategies to improve website rankings and drive organic traffic.
Conduct keyword research and analysis to identify relevant keywords and phrases.
Optimize website content and structure for search engines.
Implement on-page SEO techniques, such as optimizing meta descriptions, titles, and headings.
Develop and implement link-building strategies to increase website authority and visibility.

Website Audits and Analysis:
Perform regular website audits to identify technical issues and areas for improvement.
Monitor and analyze website traffic and user behavior using analytics tools (e.g., Google Analytics).
Track SEO performance metrics and identify areas for optimization.

Content Strategy and Creation:
Work with content teams to develop SEO-friendly content strategies.
Provide guidance and support to content creators on SEO best practices.

Collaboration and Communication:
Collaborate with web development and marketing teams to ensure SEO best practices are implemented.
Communicate SEO strategies and results to stakeholders.

Staying Updated:
Stay up to date with the latest SEO trends and best practices.
Continuously learn and adapt to changes in search engine algorithms.

Skills and Qualifications:
Strong analytical skills: ability to analyze data and identify trends.
Excellent communication skills: ability to communicate SEO strategies and results effectively.
Knowledge of SEO tools and techniques: familiarity with tools like Google Analytics, Google Search Console, and other SEO tools.
Understanding of search engine algorithms: knowledge of how search engines work and how to optimize websites for them.
Experience with content creation and optimization: ability to create and optimize content for search engines.
Experience with link building: knowledge of effective link-building strategies.
A bachelor's degree in a related field (e.g., marketing, communications) is often preferred.

As an ASO Executive, you will be responsible for optimizing mobile app visibility and performance in app stores through keyword research, listing optimization, and data analysis to drive organic downloads and user engagement. Here's a more detailed breakdown of the role:

Keyword Research and Strategy:
Conduct thorough keyword research to identify relevant and high-performing keywords for app listings.
Develop and implement ASO strategies based on keyword analysis and competitor research.
Continuously update keywords and ASO strategies based on trends, seasonality, and competitor actions.

App Store Listing Optimization:
Optimize app titles, descriptions, and metadata using relevant keywords.
Collaborate with design teams to create visually appealing and engaging app store creatives.
Ensure app listings are optimized for different languages and regions.

Data Analysis and Reporting:
Monitor and track app store rankings, downloads, and other key metrics.
Analyze app performance data to identify areas for improvement and optimize ASO efforts.
Provide regular performance reports and insights to stakeholders.

Competitor Analysis:
Conduct regular competitor analysis to identify best practices and opportunities for improvement.
Stay up to date on industry trends and best practices in ASO.
Collaboration:
Collaborate with product, marketing, and design teams to ensure ASO efforts align with overall business goals.
Work with data analytics teams to establish user journey funnels.

Tools and Technologies:
Familiarity with ASO tools and platforms (e.g., AppTweak, App Annie, App Radar).
Experience with data analysis and reporting tools.

Skills and Qualifications:
Strong analytical and problem-solving skills.
Excellent communication and collaboration skills.
Experience with keyword research and ASO strategies.
Familiarity with app store optimization best practices.
Ability to work independently and as part of a team.
Strong understanding of mobile user acquisition and marketing.
Proficiency in data analysis and reporting tools.
Experience with ASO tools and platforms (e.g., AppTweak, App Annie, App Radar).
Good understanding of SEO principles.
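As a concrete illustration of the on-page audit work both roles describe, a minimal sketch that checks title and meta description lengths with requests and BeautifulSoup; the URL and length thresholds are illustrative assumptions, not authoritative limits:

```python
# Minimal sketch: a simple on-page SEO check of title and meta
# description lengths. URL and thresholds are illustrative
# assumptions -- search engines publish no fixed character limits.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
meta = soup.find("meta", attrs={"name": "description"})
description = (meta.get("content") or "").strip() if meta else ""

# Commonly cited (but not authoritative) display limits.
print(f"title ({len(title)} chars): {title}")
if not 30 <= len(title) <= 60:
    print("  -> consider revising title length")
print(f"description ({len(description)} chars)")
if not 70 <= len(description) <= 160:
    print("  -> consider revising meta description length")
```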
Posted 1 week ago
4.0 - 9.0 years
5 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Will be responsible for planning delivery, assessing risks, gathering requirements, and the design, development, and delivery of Account Reconciliation Cloud Service (ARCS) and Oracle Financial Consolidation and Close Cloud (FCCS) applications.
Will focus on technical delivery and will be responsible for the quality of deliverables.
Will work independently with technical/functional direction from the leads.
Will provide functional knowledge and specialization in the core EPM processes, including consolidation, reporting, and account reconciliations, to translate into system solutions.
Will be expected to contribute as an individual player or lead a team of developers to implement a solution, and manage day-to-day reporting and delivery.
Will be expected to play a pivotal role in all activities, from requirement gathering through Hypercare.

Qualifications

Required:
3-6 years of project experience in implementation of Oracle FCCS and/or ARCS.
Project experience in the Enterprise Performance Management Suite of Products 11.x and/or Oracle FCCS.
Hands-on experience in building metadata, designing webforms, data reconciliation, building dashboards, and creating complex business rules to cater for custom consolidations.
Proficient in the creation of business rules: custom consolidations, intercompany eliminations, automating cash flow, custom translations and currency overrides, and dynamic calculations.
Must have a good knowledge of intercompany eliminations, currency translations, and financial statement presentations.
Should have excellent Excel, VB, and Jython scripting skills.
Knowledge of upgrade/migration from On-Premises to Cloud.

Preferred:
Should be involved in the design, implementation, prototyping, enhancement, and performance tuning of ARCS and FCCS applications.
Industry specialization in domains like Media and Entertainment, Consumer and Industrial Products, Banking, etc.
Should have experience of conducting User Acceptance Testing and preparing deliverables such as design documents, test documentation, training materials, and administration/procedural guides.
Should have understanding and experience of software development best practices.
Good to have knowledge of Relational Database Management Systems such as Oracle or SQL Server.
Exposure to traditional and Agile project delivery.
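At its core, the account reconciliation work above compares balances from two sources and flags differences beyond a tolerance. A purely illustrative sketch in plain Python (account numbers, balances, and tolerance are invented; ARCS performs this matching inside the product, not via custom code like this):

```python
# Minimal sketch: the essence of an account reconciliation --
# comparing balances from two systems and reporting differences
# above a tolerance. All values here are invented for illustration.
source = {"1000": 125_000.00, "1100": 98_400.50, "2000": -4_310.00}
subledger = {"1000": 125_000.00, "1100": 98_250.50, "3000": 500.00}
TOLERANCE = 0.01  # illustrative materiality threshold

# Walk the union of accounts; missing balances default to zero.
for account in sorted(set(source) | set(subledger)):
    diff = source.get(account, 0.0) - subledger.get(account, 0.0)
    if abs(diff) > TOLERANCE:
        print(f"account {account}: difference {diff:+,.2f} requires investigation")
```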
Posted 1 week ago