8.0 - 12.0 years
0 Lacs
kolkata, west bengal
On-site
The role requires you to design and implement data modeling solutions using relational, dimensional, and NoSQL databases. You will collaborate closely with data architects to create customized databases using a blend of conceptual, logical, and physical data models. As a data modeler, you will design, implement, and document data architecture and modeling solutions across various database types to support enterprise information management, business intelligence, machine learning, and data science initiatives.

Key responsibilities:
- Implement business and IT data requirements by devising new data strategies and designs for different data platforms and tools.
- Engage with business and application teams to execute data strategies, establish data flows, and develop conceptual, logical, and physical data models.
- Define and enforce data modeling standards, tools, and best practices, and identify architecture, infrastructure, data interfaces, security considerations, analytic models, and data visualization aspects.
- Undertake hands-on tasks such as modeling, design, configuration, installation, performance tuning, and sandbox proof of concept.
- Work proactively and independently to fulfill project requirements, communicate challenges effectively, and mitigate project delivery risks.

Qualifications:
- BE/B.Tech degree or equivalent and a minimum of 8 years of hands-on experience in relational, dimensional, and/or analytic domains, covering RDBMS, dimensional and NoSQL platforms, and data ingestion protocols.
- Proficiency with data warehouse, data lake, and big data platforms in multi-data-center environments.
- Familiarity with metadata management and data modeling tools (e.g., Erwin, ER Studio), plus team management skills.

Your primary skills should encompass developing conceptual, logical, and physical data models; implementing RDBMS, ODS, data marts, and data lakes; ensuring optimal data query performance; and expanding the existing data architecture using best practices. The ability to work both independently and collaboratively is vital for this role. Preferred skills include experience with data modeling tools and methodologies, as well as strong analytical and problem-solving abilities.
Posted 1 day ago
15.0 - 21.0 years
0 Lacs
karnataka
On-site
The Data Architecture Specialist: join a team of data architects dedicated to designing and implementing industry-relevant reinventions that help organizations achieve exceptional business value through technology.

Practice: Technology Strategy & Advisory, Capability Network
Areas of Work: Data Architecture
Level: Senior Manager
Location: Bangalore/Mumbai/Pune/Gurugram
Years of Experience: 15 to 21 years

Explore an Exciting Career at Accenture: Are you a problem solver with a passion for tech-driven transformation? Do you thrive on designing, building, and implementing strategies to enhance business architecture performance? Are you eager to contribute to an inclusive, diverse, and collaborative culture? Accenture offers a host of exciting global opportunities in Technology Strategy & Advisory for individuals like you.

The Practice - A Brief Sketch: The Technology Strategy & Advisory team at Accenture assists clients in achieving growth and efficiency through innovative R&D transformation, focusing on redefining business models using agile methodologies. As a member of this high-performing team, you will collaborate closely with clients to unlock the value of data, architecture, and AI, driving business agility and transformation towards a real-time enterprise.

As a leading Data Architecture Consulting professional, your responsibilities will include:
- Business problem data analysis: identifying, assessing, and solving complex business problems through in-depth evaluation.
- Technology-driven journey intersection: assisting clients in designing, architecting, and scaling their journey towards new technology-driven growth.
- Architecture transformation: enabling architecture transformation to solve key business problems and transition to a to-be enterprise environment.
- High-performance growth and innovation: supporting clients in building capabilities for growth and innovation to sustain high performance.

Key responsibilities:
- Present data strategy and develop technology solutions to drive discussions at the C-suite/senior leadership level.
- Utilize expertise in technologies such as big data, data integration, data governance, cloud platforms, data modeling tools, and data warehouse environments.
- Lead proof of concept implementations and define plans for scaling across multiple technology domains.
- Demonstrate creative and analytical problem-solving skills.
- Understand key value drivers of a business and how they impact engagement scope and approach.
- Develop client relationships and collaborate effectively with key stakeholders.
- Lead and motivate diverse teams to achieve common goals.

Qualifications:
- MBA from a tier 1 institute.
- Prior experience in assessing Information Strategy Maturity, data monetization, defining data-based strategy, cloud platforms, data governance, and evaluating products and frameworks.
- Practical industry expertise in Financial Services, Retail, Telecommunications, Life Sciences, Mining, or equivalent domains.

Join Accenture's Technology Strategy & Advisory team to leverage your skills, expertise, and leadership in driving transformative data architecture solutions for global clients.
Posted 1 day ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
As a Technical Consultant/Technical Architect with expertise in Fund Accounting, Oracle, and Informatica, you will collaborate with Delivery Managers, System/Business Analysts, and other subject matter experts to understand project requirements and design effective solutions. You will play a key role in estimating effort for new projects and proposals, as well as producing and reviewing technical specifications and unit test cases for ongoing interface development. Your responsibilities will include developing and implementing standards, procedures, and best practices for data maintenance, reconciliation, and exception management. You will be expected to demonstrate technical leadership, produce design/technical specifications, propose solutions, and estimate project timelines. Additionally, your role will involve guiding and mentoring junior team members in developing solutions on the GFDR platform.

Requirements:
- 10-12 years of experience in technical leadership within data warehousing and Business Intelligence domains
- Proficiency in Oracle SQL/PLSQL and stored procedures
- Familiarity with source control tools (ClearCase preferred)
- Sound understanding of Data Warehouse, Datamart, and ODS concepts
- Experience in UNIX and PERL scripting
- Proficiency in standard ETL tools such as Informatica PowerCenter
- Demonstrated technical leadership in Eagle, Oracle, Unix scripting, Perl, and scheduling tools such as Autosys/Control-M
- Experience with job scheduling tools (Control-M preferred)
- Strong knowledge of data modeling, data normalization, and performance optimization techniques
- Ability to guide and mentor juniors in solution building and troubleshooting
- Exposure to fund accounting concepts/systems and master data management is desirable
- Familiarity with data distribution and access concepts and the ability to translate conceptual models into physical ones
- Excellent interpersonal and communication skills
- Ability to collaborate effectively with cross-functional teams
- Willingness to work as part of a team engaged in both development and production support activities

Industry: IT/Computers-Software
Role: Technical Architect
Key Skills: Oracle, PL/SQL, Informatica, Autosys/Control-M, Fund Accounting, Eagle
Education: B.E/B.Tech

If you meet the qualifications and are excited about taking on a challenging and rewarding role as a Technical Consultant/Technical Architect with a focus on Fund Accounting, Oracle, and Informatica, we encourage you to reach out to us at jobs@augustainfotech.com.
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
indore, madhya pradesh
On-site
You should hold a Bachelor's degree in Physics, Mathematics, Engineering, Metallurgy, or Computer Science, along with an MSc in a relevant field such as Physics, Mathematics, Engineering, Computer Science, Chemistry, or Metallurgy. Additionally, you should possess at least 8 years of experience in Data Science and Analytics delivery. Your expertise should include deep knowledge of machine learning, statistics, optimization, and related fields. Proficiency in programming languages such as R and Python is essential, as is experience with machine learning techniques such as Natural Language Processing (NLP) and deep learning.

Furthermore, you should have hands-on experience with deep learning frameworks such as TensorFlow, Keras, Theano, or PyTorch, and be familiar with working with large datasets, including extracting data from cloud platforms and the Hadoop ecosystem. Experience with data visualization tools such as MS Power BI or Tableau, and proficiency in SQL and RDBMS for data extraction and management, are required. An understanding of data warehouse fundamentals, experience productionizing machine learning models on cloud platforms such as Azure, GCP, or AWS, and domain experience in the manufacturing industry would be advantageous. Demonstrated leadership in nurturing technical talent, a record of successfully completing complex data science projects, and excellent written and verbal communication are essential.

As an AI Expert with a minimum of 10 years of experience, your key responsibilities will include serving as a technical expert, providing guidance in the development and implementation of AI solutions, and collaborating with cross-functional teams to integrate AI technologies into products and services. You will actively participate in Agile methodologies, contribute to PI planning, and support the technical planning of products. Additionally, you will analyze technical requirements, propose AI-based solutions, collaborate with stakeholders to design AI models that meet business objectives, and stay updated on the latest advancements in AI technologies. Your role will involve conducting code reviews, mentoring team members, and driving the adoption of AI technologies across the organization. Strong problem-solving skills, a proactive approach to problem resolution, and the ability to work under tight deadlines without compromising quality are crucial for this role.

Overall, you will play a critical role in driving significant impact and value in building and growing the Data Science Centre of Excellence, providing machine learning methodology leadership, and designing POCs using ML/DL/NLP solutions for enterprise problems. Your ability to learn new technologies and techniques, work in a fast-paced environment, and partner with the business to unlock value through data projects will be key to your success in this position.
Posted 1 day ago
2.0 - 6.0 years
0 - 0 Lacs
chennai, pune
On-site
Key Responsibilities:
- Design, develop, and execute test cases for ETL processes and data validation
- Perform database testing using SQL to validate data transformations, data quality, and data integrity
- Collaborate with BI, ETL, and data engineering teams to understand business and technical requirements
- Document test plans, test strategies, and test case execution results
- Analyze and report bugs, and coordinate with developers to ensure timely resolution
- Conduct end-to-end data flow validation from source to target systems

Skills Required:
- 2-3 years of hands-on experience in ETL and database testing
- Strong proficiency in SQL (joins, aggregations, subqueries)
- Good understanding of data warehousing concepts and the ETL lifecycle
- Experience with tools like Informatica, Talend, SSIS (preferred but not mandatory)
- Ability to analyze complex datasets and identify data anomalies
- Good communication and documentation skills

To Apply, Walk-in / Contact: White Horse Manpower, #12, Office 156, 3rd Floor, Jumma Masjid Golden Complex, Jumma Masjid Road, Bangalore 560051. Contact: 8553281886
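The end-to-end source-to-target validation described in this posting is typically scripted rather than checked by hand. Below is a minimal, hedged sketch in Python of the kind of SQL checks involved: it uses an in-memory SQLite database purely so the example is self-contained, and the staging/warehouse table names (stg_orders, dw_orders) and columns are hypothetical placeholders, not taken from the posting.

```python
import sqlite3

# Self-contained demo database; in practice you would connect to the
# actual source and target systems instead.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL, status TEXT);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL, status TEXT);
    INSERT INTO stg_orders VALUES (1, 10.5, 'NEW'), (2, 99.0, 'SHIPPED');
    INSERT INTO dw_orders  VALUES (1, 10.5, 'NEW'), (2, 99.0, 'SHIPPED');
""")

# 1. Row-count reconciliation between source (staging) and target (warehouse).
src_count = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
assert src_count == tgt_count, f"Row counts differ: {src_count} vs {tgt_count}"

# 2. Record-level comparison: rows present in source but missing in target.
missing = conn.execute("""
    SELECT order_id, amount, status FROM stg_orders
    EXCEPT
    SELECT order_id, amount, status FROM dw_orders
""").fetchall()
assert not missing, f"Rows missing in target: {missing}"

# 3. Aggregate check on a measure column.
src_sum = conn.execute("SELECT ROUND(SUM(amount), 2) FROM stg_orders").fetchone()[0]
tgt_sum = conn.execute("SELECT ROUND(SUM(amount), 2) FROM dw_orders").fetchone()[0]
assert src_sum == tgt_sum, f"Amount totals differ: {src_sum} vs {tgt_sum}"

print("Source-to-target validation passed.")
```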
Posted 2 days ago
2.0 - 6.0 years
0 - 0 Lacs
bangalore, pune
On-site
Key Responsibilities:
- Design, develop, and execute ETL test plans, test cases, and test scripts
- Perform data validation and transformation logic checks
- Validate data movement across source, staging, and target systems
- Use Python scripts for automation and data testing
- Collaborate with development and business teams to ensure quality delivery
- Log, track, and verify resolution of issues and bugs

Skills Required:
- 2-3 years of experience in ETL testing and data warehouse concepts
- Strong knowledge of SQL for data validation and querying
- Hands-on experience with Python scripting for test automation
- Good understanding of ETL tools (such as Informatica, Talend), if any
- Ability to handle large volumes of data and complex data mapping scenarios
- Good communication and problem-solving skills

To Apply, Walk-in / Contact: White Horse Manpower, #12, Office 156, 3rd Floor, Jumma Masjid Golden Complex, Jumma Masjid Road, Bangalore 560051. Contact: 9632024646
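For the Python-based test automation this posting asks for, one common pattern is to express data-quality and reconciliation checks as pytest tests over pandas DataFrames. The sketch below assumes that approach; the fixture data stands in for what would normally be pulled with pd.read_sql from the real source and target systems, and all table/column names are hypothetical.

```python
import pandas as pd
import pytest

# Hypothetical extracts of the source and target tables; in practice these
# would come from pd.read_sql() against the actual databases.
@pytest.fixture
def source_df():
    return pd.DataFrame({"customer_id": [1, 2, 3], "balance": [100.0, 250.5, 0.0]})

@pytest.fixture
def target_df():
    return pd.DataFrame({"customer_id": [1, 2, 3], "balance": [100.0, 250.5, 0.0]})

def test_row_counts_match(source_df, target_df):
    assert len(source_df) == len(target_df)

def test_no_null_keys(target_df):
    assert target_df["customer_id"].notna().all()

def test_no_duplicate_keys(target_df):
    assert not target_df["customer_id"].duplicated().any()

def test_balances_reconcile(source_df, target_df):
    # Compare a measure column after aligning source and target on the business key.
    merged = source_df.merge(target_df, on="customer_id", suffixes=("_src", "_tgt"))
    assert (merged["balance_src"] == merged["balance_tgt"]).all()
```

Run with `pytest -v`; each failed assertion pinpoints the reconciliation rule that broke.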
Posted 2 days ago
15.0 - 21.0 years
0 Lacs
haryana
On-site
As a Data Architecture Specialist at Accenture, you will be part of a team of data architects focused on designing and executing industry-relevant reinventions that help organizations achieve exceptional business value through technology. You will work in the Technology Strategy & Advisory practice, within the Capability Network, with a focus on Data Architecture at Senior Manager level, in locations such as Bangalore, Mumbai, Pune, or Gurugram; the role requires 15 to 21 years of experience.

Accenture offers an exciting career opportunity for individuals who are problem solvers and passionate about technology-driven transformation. If you enjoy designing, building, and implementing strategies to enhance business architecture performance and want to be part of an inclusive, diverse, and collaborative culture, then Accenture Technology Strategy & Advisory is the place for you.

In this role, you will collaborate with clients to unlock the value of data, architecture, and AI to drive business agility and transformation towards a real-time enterprise. Your responsibilities will include identifying and solving complex business problems through data analysis, helping clients design and scale their technology-driven journey, enabling architecture transformation, and assisting clients in building capabilities for growth and innovation.

To excel in this role, you will need to present data strategy, develop technology solutions, and engage in C-suite level discussions. You should have a deep understanding of technologies such as big data, data integration, data governance, cloud platforms, and data modeling tools. Leading proof of concept implementations, demonstrating creative problem-solving abilities, leveraging business value drivers, developing client relationships, collaborating with diverse teams, and exhibiting strong leadership, communication, and organizational skills are key aspects of this role.

If you are looking to bring your best skills forward and be part of a dynamic team that thrives on innovation and growth, the Data Architecture Specialist role at Accenture is the perfect opportunity for you.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
Your Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL).
- Collaborate with solution teams and Data Architects to implement data strategies, build data flows, and develop logical/physical data models.
- Work with Data Architects to define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Engage in hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
- Proactively and independently address project requirements and articulate issues/challenges to reduce project delivery risks.

Your Profile:
- Bachelor's degree in computer/data science or related technical experience.
- 7+ years of hands-on relational, dimensional, and/or analytic experience utilizing RDBMS, dimensional and NoSQL data platform technologies, and ETL and data ingestion protocols.
- Demonstrated experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts.
- Proficiency in metadata management, data modeling, and related tools (e.g., Erwin, ER Studio).
- Preferred: experience with Azure/Azure Databricks services (Azure Data Factory, Azure Data Lake Storage, Azure Synapse, and Azure Databricks); experience working on SAP Datasphere is a plus.
- Experience in team management, communication, and presentation.
- Understanding of agile delivery methodology and experience working in a scrum environment.
- Ability to translate business needs into data vault and dimensional data models supporting long-term solutions.
- Collaborate with the Application Development team to implement data strategies, creating logical and physical data models using best practices to ensure high data quality and reduced redundancy.
- Optimize and update logical and physical data models to support new and existing projects, and maintain them along with the corresponding metadata.
- Develop best practices for standard naming conventions and coding practices to ensure data model consistency, and recommend opportunities for data model reuse in new environments.
- Perform reverse engineering of physical data models from databases and SQL scripts; evaluate data models and physical databases for variances and discrepancies.
- Validate business data objects for accuracy and completeness, and analyze data-related system integration challenges to propose appropriate solutions.
- Develop data models according to company standards; guide System Analysts, Engineers, Programmers, and others on project limitations and capabilities, performance requirements, and interfaces.
- Review modifications to existing data models to improve efficiency and performance, and examine new application designs, recommending corrections as needed.

#IncludingYou: Diversity, equity, inclusion, and belonging are cornerstones of ADM's efforts to continue innovating, driving growth, and delivering outstanding performance. ADM is committed to attracting and retaining a diverse workforce and creating welcoming, inclusive work environments that enable every ADM colleague to feel comfortable, make meaningful contributions, and grow their career. ADM values the unique backgrounds and experiences that each person brings to the organization, understanding that diversity of perspectives makes us stronger together. For more information regarding ADM's efforts to advance Diversity, Equity, Inclusion & Belonging, please visit the website: Diversity, Equity and Inclusion | ADM.
About ADM: At ADM, the power of nature is unlocked to provide access to nutrition worldwide. With industry-advancing innovations, a comprehensive portfolio of ingredients and solutions catering to diverse tastes, and a commitment to sustainability, ADM offers customers an edge in addressing nutritional challenges. As a global leader in human and animal nutrition and the premier agricultural origination and processing company worldwide, ADM's capabilities in insights, facilities, and logistical expertise are unparalleled. From ideation to solution, ADM enriches the quality of life globally. Learn more at www.adm.com.
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
indore, madhya pradesh
On-site
You should have 6-8 years of hands-on experience with Big Data technologies such as PySpark (DataFrame and SparkSQL), Hadoop, and Hive, along with good hands-on experience with Python and Bash scripts and a solid understanding of SQL and data warehouse concepts. Strong analytical, problem-solving, data analysis, and research skills are crucial for this role, as is a demonstrable ability to think creatively and independently, beyond relying solely on readily available tools. Excellent communication, presentation, and interpersonal skills are a must for effective collaboration within the team.

Hands-on experience with cloud-platform Big Data technologies such as IAM, Glue, EMR, Redshift, S3, and Kinesis is required. Experience in orchestrating with Airflow or any job scheduler is highly beneficial, and familiarity with migrating workloads from on-premise to cloud and cloud-to-cloud migrations is also desired.

In this role, you will develop efficient ETL pipelines based on business requirements while adhering to development standards and best practices. Responsibilities also include integration testing of different pipelines in the AWS environment, providing estimates for development, testing, and deployments on various environments, and participating in code peer reviews to ensure compliance with best practices. Creating cost-effective AWS pipelines using the necessary AWS services (S3, IAM, Glue, EMR, Redshift, etc.) is a key aspect of this position.

Your experience should range from 6 to 8 years in relevant fields. The job reference number for this position is 13024.
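A typical batch ETL pipeline of the kind this posting describes, written with PySpark and runnable on EMR or a Glue Spark job, might look like the following sketch. The S3 paths, column names, and aggregation logic are illustrative assumptions only.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal batch ETL skeleton: read raw CSV from S3, clean and aggregate,
# then write partitioned Parquet back to a curated S3 zone.
spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/orders/2024-01-01/")   # hypothetical path
)

cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_status").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

daily_revenue = (
    cleaned.groupBy("order_date", "country")
           .agg(F.sum("amount").alias("total_amount"),
                F.countDistinct("order_id").alias("order_count"))
)

(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/daily_revenue/")  # hypothetical path
)

spark.stop()
```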
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
As an Analyst Programmer in the WPFH department in Gurgaon, you will be an integral part of the Data team responsible for building data integration and distribution solutions within the Distribution Data and Reporting team. Your role will involve working closely with technical leads, business analysts, and product teams to design, develop, and troubleshoot ETL jobs for various operational data stores. You will be expected to demonstrate innovative problem-solving skills, strong interpersonal abilities, and a high level of ownership in a dynamic working environment.

Key Responsibilities:
- Collaborate with technical leads, business analysts, and subject matter experts.
- Design and develop ETL jobs based on data model requirements.
- Utilize the Informatica PowerCenter tool set and Oracle database for ETL job development.
- Provide development estimates and ensure adherence to standards and best practices.
- Coordinate dependencies and deliverables with cross-functional teams.

Essential Skills - Technical:
- Minimum 3 years of experience using the Informatica PowerCenter tool set.
- Proficiency in Snowflake, source control tools, Control-M, UNIX scripting, and SQL/PL-SQL.
- Experience with Data Warehouse, Datamart, and ODS concepts, and Oracle/SQL Server utilities.
- Knowledge of data normalization, OLAP, and Oracle performance optimization.

Essential Skills - Functional:
- Minimum 3 years of experience in financial organizations with broad-based business process knowledge.
- Strong communication, interpersonal, and client-facing skills.
- Ability to work closely with cross-functional teams and data stewards.

About You:
- Bachelor's degree in B.E./B.Tech/MBA/M.C.A or equivalent.
- Minimum 3 years of experience in data integration and distribution.
- Experience in building web services and APIs.
- Knowledge of Agile software development methodologies.

At our organization, we value your wellbeing, support your development, and offer a comprehensive benefits package. We prioritize a flexible work environment that promotes a healthy work-life balance. Join our team at WPFH and be part of a collaborative environment where you can contribute to building better financial futures. Explore more about our dynamic working approach and future opportunities at careers.fidelityinternational.com.
Posted 2 days ago
15.0 - 21.0 years
0 Lacs
haryana
On-site
The Data Architecture Specialist: join our team of data architects who design and execute industry-relevant reinventions that allow organizations to realize exceptional business value from technology. You will be part of the Technology Strategy & Advisory practice at Accenture, operating at Senior Manager level in locations such as Bangalore, Mumbai, Pune, and Gurugram. With 15 to 21 years of experience, you will explore exciting opportunities to contribute to Accenture's global initiatives in Technology Strategy & Advisory.

As a Data Architecture Specialist, you will collaborate with clients to leverage data, architecture, and AI to drive business agility and real-time transformation. Your responsibilities will include:
- Conducting in-depth analysis to identify and solve complex business problems
- Guiding clients in designing and scaling technology-driven growth journeys
- Facilitating architecture transformations for improved business outcomes
- Enabling clients to build capabilities for sustained high performance and innovation

To excel in this role, you are expected to:
- Present data strategies and technology solutions for C-suite discussions
- Demonstrate expertise in technologies like big data, data integration, and cloud platforms
- Lead proof of concept implementations and scale them across various domains
- Showcase proficiency in data-led projects and RFP responses
- Utilize analytical skills in problem-solving and understanding business value drivers
- Develop client relationships and manage stakeholder engagements effectively
- Collaborate with diverse teams, leveraging leadership and communication skills to achieve common goals

If you are a problem solver with a passion for technology-driven transformations and a desire to enhance business architecture performance, Accenture Technology Strategy & Advisory is the perfect place for you to contribute your skills and expertise. Welcome to a dynamic and inclusive culture where innovation and collaboration thrive.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
delhi
On-site
As a Partner Solution Engineer at Snowflake, you will play a crucial role in technically onboarding and enabling partners to re-platform their Data and AI applications onto the Snowflake AI Data Cloud. Collaborating with partners to develop Snowflake solutions in customer engagements, you will work with them to create assets and demos, build hands-on POCs, and pitch Snowflake solutions. You will also assist Solution Providers/Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake.

Your responsibilities will include keeping partners up to date on key Snowflake product updates and future roadmaps, helping them represent the latest technology solutions and benefits to their clients, and running technical enablement programs that provide best practices and solution design workshops so partners can create effective solutions. Success in this position will require you to drive strategic engagements by quickly grasping new concepts and articulating their business value. You will showcase the impact of Snowflake through compelling customer success stories and case studies, demonstrating a strong understanding of how partners make revenue through the industry priorities and complexities they face.

Preferred skill sets and experience:
- 10+ years of relevant experience overall
- Experience working with Tech Partners, ISVs, and System Integrators (SIs) in India, and developing data domain thought leadership within the partner community
- Presales or hands-on experience with Data Warehouse, Data Lake, or Lakehouse platforms
- Experience with partner integration ecosystems such as Alation, Fivetran, Informatica, and dbt Cloud
- Hands-on experience and strong knowledge of Docker and how to containerize Python-based applications, plus knowledge of container networking and Kubernetes
- Proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps
- Experience in the AI/ML domain is a plus

Snowflake is rapidly expanding, and as part of the team, you will help enable and accelerate the company's growth. If you share Snowflake's values, challenge ordinary thinking, and push the pace of innovation while building a future for yourself and Snowflake, this role could be the perfect fit for you. Please visit the Snowflake Careers Site for salary and benefits information if the job is located in the United States.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Senior Healthcare Business Analyst at CitiusTech, you will be part of an Agile team designing and building healthcare applications, implementing new features, and ensuring adherence to the best coding development standards. Your responsibilities will include delivering technical preliminary design documents, conducting detailed analysis of data systems to solve complex business problems in an agile environment, providing consulting support for IT and business partners, meeting defined deadlines with a high level of quality, creating system test plans and test data, participating in deliverables required by approved development lifecycles, creating various types of documentation, performing testing, and adhering to IT and corporate policies, procedures, and standards.

With 7-8 years of experience, you will be based in Mumbai, Pune, or Chennai. An Engineering degree (BE/ME/BTech/MTech/BSc/MSc) and technical certification in multiple technologies are required. Relevant industry-recognized project management certifications such as CSPO, PMP, Agile PM, or SAFe are desirable. Mandatory technical skills include US Healthcare domain knowledge, strong SQL knowledge, experience in data warehouse and data management projects, collaboration with DBAs and DB developers, creating BRDs, FRDs, UML, and flow diagrams, facilitating business requirement elicitation sessions, and identifying potential issues and risks. A good attitude, experience in the Agile model, excellent communication skills, and adherence to departmental policies and procedures are essential. Good-to-have skills include experience as a Development/Data Analyst, data warehousing, working with tools like Microsoft Project, Jira, and Confluence, strategic thinking, and knowledge of the vulnerability and security domain.

CitiusTech is committed to combining IT services, consulting, products, accelerators, and frameworks with a client-first mindset and next-gen tech understanding to humanize healthcare and make a positive impact on human lives. The company values Passion, Respect, Openness, Unity, and Depth (PROUD) of knowledge, creating a fun, transparent, non-hierarchical, diverse work culture focused on continuous learning and work-life balance. Rated as a Great Place to Work, CitiusTech offers comprehensive benefits to ensure a long and rewarding career. The EVP "Be You Be Awesome" reflects the company's efforts to create a great workplace supporting employee growth, well-being, and success. By collaborating with global leaders at CitiusTech, you will have the opportunity to shape the future of healthcare and positively impact human lives.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
You will be working as a Database Administrator (DBA) - Modernization Lead with a minimum of 5 years of experience. The job location can be Mohali, Panchkula, Bangalore, Pune, Mumbai, or Gurgaon, with night shifts (US timings, 8 PM to 6 AM) on a rotational basis. Interviews are scheduled for different dates and locations. The ideal candidate should have good communication skills and an AI-cleared profile.

As the DBA Modernization Lead, your primary responsibility will be to lead the overhaul of the existing SQL Server database to a modern, scalable SQL, PostgreSQL, or other scalable backend on AWS. This role will involve designing and implementing the new database architecture, ensuring data integrity, optimizing performance, and supporting the development team in transitioning old business logic to the new platform.

Key Responsibilities:
- Lead the design and implementation of the new database architecture on AWS.
- Migrate data from SQL Server to SQL or PostgreSQL.
- Optimize database performance and ensure data integrity.
- Collaborate with the Architect and development teams to translate old business logic into the new platform.
- Implement data security measures and ensure compliance with industry standards.
- Support data ingestion components and ensure efficient data processing.
- Provide guidance and mentoring to junior DBAs and developers.

Core Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 7+ years of experience as a DBA, focusing on database design, migration, and optimization.
- Expertise in SQL Server, SQL, PostgreSQL, and AWS database services.
- Strong understanding of data modeling, ETL processes, and data warehousing.
- Experience with Entity Framework and T-SQL.
- Knowledge of database security best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Skills required for this role include AWS, SQL Server, database management, PostgreSQL, ETL processes, data warehousing, and Entity Framework.
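For illustration only, a single-table copy from SQL Server to PostgreSQL can be sketched in Python with pandas and SQLAlchemy as below; a production migration of this scale would more likely rely on purpose-built tooling such as AWS DMS, and the connection strings and table name here are hypothetical placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings -- replace with real hosts and credentials.
mssql_engine = create_engine(
    "mssql+pyodbc://user:password@legacy-sql-server/LegacyDB?driver=ODBC+Driver+17+for+SQL+Server"
)
pg_engine = create_engine("postgresql+psycopg2://user:password@aws-rds-host:5432/modern_db")

TABLE = "customers"  # hypothetical table name

# Stream the legacy table in chunks so large tables do not exhaust memory,
# appending each chunk into the new PostgreSQL schema.
total = 0
for chunk in pd.read_sql(f"SELECT * FROM dbo.{TABLE}", mssql_engine, chunksize=10_000):
    chunk.to_sql(TABLE, pg_engine, schema="public", if_exists="append", index=False)
    total += len(chunk)

# Basic post-load reconciliation: compare row counts on both sides.
src_count = pd.read_sql(f"SELECT COUNT(*) AS n FROM dbo.{TABLE}", mssql_engine)["n"][0]
tgt_count = pd.read_sql(f"SELECT COUNT(*) AS n FROM public.{TABLE}", pg_engine)["n"][0]
print(f"Copied {total} rows; source={src_count}, target={tgt_count}")
```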
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for working as an AWS Data Engineer at YASH Technologies. Your role will involve tasks related to data collection, processing, storage, and integration. Proficiency in data Extract-Transform-Load (ETL) processes and data pipeline setup is essential, as is knowledge of database and data warehouse technologies on the AWS cloud platform. Prior experience handling time-series and unstructured data types, such as image data, is required for this position. Additionally, you should have experience developing data analytics software on the AWS cloud, either as a full-stack or back-end developer, along with skills in software quality assessment, testing, and API integration.

Working at YASH, you will have the opportunity to build a career in a supportive and inclusive team environment. The company focuses on continuous learning and growth by providing career-oriented skilling models and utilizing technology for upskilling and reskilling activities. You will be part of a Hyperlearning workplace grounded in the principles of flexible work arrangements, emotional positivity, self-determination, trust, transparency, open collaboration, and support for achieving business goals. YASH Technologies offers stable employment with a great atmosphere and an ethical corporate culture.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
delhi
On-site
As a Tester in the financial services industry, you will be an integral part of an agile team, driving user story analysis, feature grooming, and the design and development of comprehensive test scripts. Your responsibilities will include writing complex SQL queries against large datasets in AWS, developing and maintaining BDD test scenarios and regression plans, and participating in the test development life cycle, including requirements analysis and design.

Your role will involve business intelligence testing: validating DataMarts, ODS, data models, and SSRS reports. You will need to understand data flow and test strategy for ETL, data warehouse, and business intelligence testing. ETL testing of mappings, transformations, and data pipelines will be a crucial aspect of your responsibilities. Additionally, you will work with the team to continually enhance test processes and practices, ensuring adherence to standards within the project team.

To excel in this role, you must possess excellent hands-on PC and organizational skills, familiarity with advanced features in MS Word and MS PowerPoint, and the ability to work with complex spreadsheets and embedded formulas. Exposure to VBA macro development within MS Excel is essential. You should have an understanding of software QA/QE methodologies, tools, and processes, along with experience in manual functional testing and automation scripting. Experience with defect management applications like Jira and Xray, knowledge of DevOps and continuous integration/continuous delivery environments, and the ability to design, develop, debug, and execute automation scripts are necessary qualifications. Hands-on experience in test automation frameworks using tools like Alteryx, Selenium, Java, or Python is preferred. Understanding SQL, writing SQL queries, and comprehending data retrieval, formatting, and integration are crucial skills for this role.

Your solid analytical, quantitative, and problem-solving skills will enable you to interpret data effectively, reach conclusions, and take appropriate actions. Strong communication skills are essential for conveying technology-related information clearly to different audiences and detailing implementation processes. Leadership competencies, cross-collaboration skills, and workflow facilitation with internal business partners are key to your success in this role.

A bachelor's degree or equivalent work experience is required, along with at least 4 years of experience in the financial services industry as a Tester. Experience in functional, integration, regression, system, end-to-end, and acceptance testing is preferred. Familiarity with Alteryx, strong oral and written communication skills (including presentation skills), and experience working with agile and scrum methodology are advantageous. If you are proactive, responsive, and thrive in a fast-paced changing environment, this role offers an opportunity to leverage your expertise and contribute significantly to the success of the team and the organization.
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As an Associate Technical Product Analyst - Global Data & Analytics Platform at McDonald's Corporation in Hyderabad, you will be an integral part of the Global Technology Enterprise Products & Platforms (EPP) team. In this role, you will focus on data management and operations within the Global Data & Analytics Platform (GDAP) to support integrations with core corporate accounting/financial/reporting applications. Your vision will align with McDonald's goal to be a people-led, product-centric, forward-thinking, and trusted technology partner.

Your responsibilities will include supporting the Technical Product Management leadership on technical/IT-related delivery topics such as trade-offs in implementation approaches and tech stack selection. You will provide technical guidance for developers and squad members, manage the output of internal/external squads to ensure adherence to McDonald's standards, participate in roadmap and backlog preparation, and maintain technical process flows and solution architecture diagrams at the product level. Additionally, you will lead acceptance criteria creation, validate development work, support the hiring and development of engineers, and act as a technical developer as needed.

To excel in this role, you should possess a Bachelor's degree in computer science or engineering, along with at least 3 years of hands-on experience designing and implementing solutions using AWS Redshift and Talend. Experience with data warehouses is a plus, as is familiarity with accounting and financial solutions across different industries. Knowledge of Agile software development processes, collaborative problem-solving skills, and excellent communication abilities are essential for success in this position.

Preferred qualifications include proficiency in SQL, data integration tools, and scripting languages, as well as a strong understanding of Talend, AWS Redshift, and other AWS services. Experience with RESTful APIs, microservices architecture, DevOps practices, and tools like Jenkins and GitHub is highly desirable. Foundational expertise in security standards, cloud architecture, and Oracle cloud security is also advantageous.

This full-time role, based in Hyderabad, India, offers a hybrid work mode. If you are a detail-oriented individual with a passion for leveraging technology to drive business outcomes and are eager to contribute to a global team dedicated to innovation and excellence, we invite you to apply for the position of Associate Technical Product Analyst at McDonald's Corporation.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a Senior Data Engineer at our organization, you will play a crucial role in the Data Engineering team within the Enterprise Data & Analytics organization. Your primary responsibility will be to design, build, and maintain both batch and real-time data pipelines that cater to the needs of our enterprise, analyst communities, and downstream systems. Collaboration with data architects is essential to ensure that data engineering solutions align with long-term architecture objectives.

You will maintain and optimize the data infrastructure to facilitate accurate data extraction, transformation, and loading from diverse data sources, and develop ETL processes to extract and manipulate data effectively. Ensuring data accuracy, integrity, privacy, security, and compliance will be a top priority, and you will need to follow quality control procedures and adhere to SOX compliance standards. Monitoring data systems performance, implementing optimization strategies, improving operational practices and metrics, and mentoring junior engineers will also be part of your responsibilities.

To be successful in this role, you should possess a Bachelor's degree in Computer Science, Information Systems, or a related field, along with a minimum of 5 years of relevant experience in data engineering. Experience with cloud data warehouse solutions (such as Snowflake) and cloud platforms (e.g., AWS, Azure, GCP), as well as exposure to Salesforce or any CRM system, will be beneficial. Proficiency in advanced SQL, relational databases, database design, large data sets, distributed computing (Spark/Hive/Hadoop), object-oriented languages (Python, Java), scripting languages, data pipeline tools (Airflow), and agile methodology is required.

Your problem-solving, communication, and organizational skills, ability to work independently and collaboratively, self-starting attitude, stakeholder communication skills, and quick learning and adaptability will be crucial for excelling in this role. By following best practices and standards and contributing to the maturity of data engineering practices, you will be instrumental in driving business transformation through data.
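Batch pipelines like those described above are often orchestrated with Airflow, which this posting lists among the required tools. The following is a minimal sketch of an Airflow 2.x DAG with placeholder extract/transform/load callables; the DAG id, schedule, and task bodies are illustrative assumptions, not a prescribed design.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull the day's records from the source system.
    print("extracting for", context["ds"])


def transform(**context):
    # Placeholder: clean and conform the extracted data.
    print("transforming")


def load(**context):
    # Placeholder: load the conformed data into the warehouse (e.g. Snowflake).
    print("loading")


default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="daily_sales_etl",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load.
    t_extract >> t_transform >> t_load
```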
Posted 4 days ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
The EMEA and APAC Mortgage desk at TMM-AFT makes markets in EMEA/APAC loan business (Resi/consumer) and asset-backed securities, with a focus on purchasing whole loan pools, originating/financing new loan assets (CRE/Resi/consumer), and creating new securities backed by loan assets for distribution to clients. As an Analyst supporting the desk, you will be responsible for asset management and information solutions to facilitate the acquisition, monitoring, reporting, and disposition/securitization of loans.

Your responsibilities will include managing financing facilities, decoding legal documents into Excel models, overseeing client relationships and warehouse deals, handling mortgage and consumer loan data, monitoring collateral adequacy, tracking deliverables, and collaborating with IT on data warehouse and reporting projects. You will ensure data accuracy, coordinate with internal departments on database enhancements, develop reporting solutions, and communicate project results to various business groups.

The basic qualifications for this role include 1-4 years of experience in mortgages or consumer portfolio/collateral analytics/asset management and a strong academic background in finance, business, math, or accounting, with excellent communication and analytical skills. Proficiency in SQL, RDBMS databases (SQL Server or Sybase ASE), data reporting, and data visualization is required, with Tableau experience considered a plus. Strong project management and stakeholder management skills, along with the ability to work under tight deadlines, prioritize workload, and collaborate effectively within a team, are essential.

Join the dynamic environment of the EMEA/APAC Mortgage desk and contribute to shaping the future of banking and capital markets by leveraging your expertise and skills in asset management and data analytics.
Posted 6 days ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Job title: Senior Software Engineer
Experience: 5-8 years
Primary skills: Python, Spark or PySpark, DWH ETL
Database: SparkSQL or PostgreSQL
Secondary skills: Databricks (Delta Lake, Delta tables, Unity Catalog)
Work Model: Hybrid (twice weekly)
Cab Facility: Yes
Work Timings: 10am to 7pm
Interview Process: 3 rounds (3rd round face-to-face mandatory)
Work Location: Karle Town Tech Park, Nagawara, Hebbal, Bengaluru 560045

About Business Unit: The Architecture Team plays a pivotal role in the end-to-end design, governance, and strategic direction of product development within Epsilon People Cloud (EPC). As a centre of technical excellence, the team ensures that every product feature is engineered to meet the highest standards of scalability, security, performance, and maintainability. Their responsibilities span architectural ownership of critical product features, driving techno-product leadership, enforcing architectural governance, and ensuring systems are built with scalability, security, and compliance in mind. They design multi-cloud and hybrid-cloud solutions that support seamless integration across diverse environments and contribute significantly to interoperability between EPC products and the broader enterprise ecosystem. The team fosters innovation and technical leadership while actively collaborating with key partners to align technology decisions with business goals. Through this, the Architecture Team ensures the delivery of future-ready, enterprise-grade, efficient, performant, secure, and resilient platforms that form the backbone of Epsilon People Cloud.

Why we are looking for you:
- You have experience working as a Data Engineer with strong database fundamentals and an ETL background.
- You have experience working in a data warehouse environment, dealing with data volumes of terabytes and above.
- You have experience working with relational data systems, preferably PostgreSQL and SparkSQL.
- You have excellent design and coding skills and can mentor a junior engineer in the team.
- You have excellent written and verbal communication skills.
- You are experienced and comfortable working with global clients.
- You work well with teams and are able to work with multiple collaborators, including clients, vendors, and delivery teams.
- You are proficient with bug tracking and test management toolsets to support development processes such as CI/CD.

What you will enjoy in this role: As part of the Epsilon Technology practice, the pace of the work matches the fast-evolving demands of the industry. You will get to work on the latest tools and technology, deal with data of petabyte scale, and work on homegrown frameworks on Spark, Airflow, etc. You will gain exposure to the digital marketing domain, where Epsilon is a market leader, and work closely with consumer data across different segments that will eventually provide insights into consumer behaviours and patterns to design digital ad strategies. As part of the dynamic team, you will have opportunities to innovate and put your recommendations forward, using existing standard methodologies and defining them as per evolving industry standards. You will also have the opportunity to work with Business, System, and Delivery to build a solid foundation in the digital marketing domain, in an open and transparent environment that values innovation and efficiency. Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice.

What will you do?
- Develop a deep understanding of the business context under which your team operates and present feature recommendations in an agile working environment.
- Lead, design, and code solutions on and off database to ensure application access and enable data-driven decision making for the company's multi-faceted ad serving operations.
- Work closely with Engineering resources across the globe to ensure enterprise data warehouse solutions and assets are actionable, accessible, and evolving in lockstep with the needs of the ever-changing business model. This role requires deep expertise in Spark and strong proficiency in ETL, SQL, and modern data engineering practices.
- Design, develop, and manage ETL/ELT pipelines in Databricks using PySpark/SparkSQL, integrating various data sources to support business operations.
- Lead in the areas of solution design, code development, quality assurance, data modelling, and business intelligence, and mentor junior engineers in the team.
- Stay abreast of developments in the data world in terms of governance, quality, and performance optimization.
- Run effective client meetings, understand deliverables, and drive successful outcomes.

Qualifications:
- Bachelor's degree in Computer Science or an equivalent degree is required.
- 5-8 years of data engineering experience with expertise using Apache Spark and databases (preferably Databricks) in marketing technologies and data management, and technical understanding in these areas.
- Monitor and tune Databricks workloads to ensure high performance and scalability, adapting to business needs as required.
- Solid experience in basic and advanced SQL writing and tuning.
- Experience with Python.
- Solid understanding of CI/CD practices, with experience in Git for version control and integration for Spark data projects.
- Good understanding of disaster recovery and business continuity solutions.
- Experience with scheduling applications with complex interdependencies, preferably Airflow.
- Good experience working with geographically and culturally diverse teams.
- Understanding of data management concepts in both traditional relational databases and big data lakehouse solutions such as Apache Hive, AWS Glue, or Databricks.
- Excellent written and verbal communication skills, ability to handle complex products, and good communication and problem-solving skills, with the ability to manage multiple priorities.
- Ability to diagnose and solve problems quickly; diligent, able to multi-task, prioritize, and quickly change priorities; good time management.
- Good to have: knowledge of cloud platforms (cloud security) and familiarity with Terraform or other infrastructure-as-code tools.

About Epsilon: Epsilon is a global data, technology and services company that powers the marketing and advertising ecosystem. For decades, we have provided marketers from the world's leading brands the data, technology and services they need to engage consumers with 1 View, 1 Vision and 1 Voice: 1 View of their universe of potential buyers, 1 Vision for engaging each individual, and 1 Voice to harmonize engagement across paid, owned and earned channels. Epsilon's comprehensive portfolio of capabilities across our suite of digital media, messaging and loyalty solutions bridges the divide between marketing and advertising technology. We process 400+ billion consumer actions each day using advanced AI and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon has been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Epsilon is a global company with more than 9,000 employees around the world.
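As a rough illustration of the Databricks ETL/ELT work described in this posting, the sketch below reads raw JSON with PySpark, conforms it, writes a partitioned Delta table, and queries it back with SparkSQL. It assumes a Databricks (or Delta-enabled Spark) environment, and the paths and three-level Unity Catalog table name are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks the runtime already provides a session; getOrCreate() returns it.
spark = SparkSession.builder.getOrCreate()

raw = (
    spark.read.format("json")
    .load("/mnt/raw/clickstream/2024-01-01/")   # hypothetical landing path
)

events = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Write the conformed data as a Delta table registered in the metastore
# (three-level Unity Catalog naming shown here as an assumption).
(
    events.write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("analytics.web.clickstream_events")
)

# Downstream consumers can then query it with SparkSQL, for example:
daily = spark.sql("""
    SELECT event_date, COUNT(*) AS event_count
    FROM analytics.web.clickstream_events
    GROUP BY event_date
    ORDER BY event_date
""")
daily.show()
```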
Posted 6 days ago
5.0 - 10.0 years
20 - 30 Lacs
Chennai
Work from Office
• Experience in cloud-based systems (GCP, BigQuery)
• Strong SQL programming skills
• Expertise in database programming and performance tuning techniques
• Possess knowledge of data warehouse architectures, ETL, reporting/analytic tools
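For context on the GCP/BigQuery skills listed above, a hedged example of running a SQL query against BigQuery from Python with the google-cloud-bigquery client is shown below; the project, dataset, and table names are invented for illustration.

```python
from google.cloud import bigquery

# Assumes Google Cloud credentials are configured in the environment
# (e.g. GOOGLE_APPLICATION_CREDENTIALS); project/dataset names are hypothetical.
client = bigquery.Client(project="example-analytics-project")

query = """
    SELECT order_date,
           SUM(amount)              AS total_amount,
           COUNT(DISTINCT order_id) AS order_count
    FROM `example-analytics-project.sales_dw.fact_orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

# Submit the query job and iterate over the result rows.
for row in client.query(query).result():
    print(row["order_date"], row["total_amount"], row["order_count"])
```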
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As an associate architect on the SAP Datasphere team at SAP, you will be a key contributor to the comprehensive end-to-end data management and decision-making cloud solution built on SAP Business Technology Platform (SAP BTP). Your role will involve working in a highly collaborative team environment dedicated to delivering seamless and scalable access to mission-critical business data. You will be responsible for various aspects including data integration, data cataloguing, semantic modelling, data warehousing, data federation, and data virtualization. SAP Datasphere empowers data professionals to distribute business data across the data landscape while effectively preserving business context and logic.

SAP is a global leader in providing innovative solutions that help over four hundred thousand customers worldwide work more efficiently and gain valuable business insights. With a rich history in enterprise resource planning (ERP) software, SAP has transformed into a market leader offering end-to-end business application software, database services, analytics, intelligent technologies, and experience management. As a cloud company with over two hundred million users and a diverse workforce of over one hundred thousand employees, we are committed to a purpose-driven and future-focused approach. Our organizational culture is deeply collaborative, emphasizing personal development and individual contributions. At SAP, you will have the opportunity to showcase your talents and bring out your best.

SAP values inclusivity, prioritizes the health and well-being of its employees, and offers flexible working models to ensure that everyone, regardless of background, feels included and empowered to perform at their best. We believe in the strength of diversity and invest in our employees to foster confidence and enable them to reach their full potential. By embracing the unique capabilities and qualities of each individual, we aim to unleash all talent and contribute to building a better and more equitable world. SAP is an equal opportunity workplace and an affirmative action employer. We uphold the principles of Equal Employment Opportunity and provide accessibility accommodations for applicants with physical and/or mental disabilities. If you require assistance or special accommodations while navigating our website or completing your application, please reach out to the Recruiting Operations Team at Careers@sap.com.

For SAP employees interested in the SAP Employee Referral Program, please note that only permanent roles are eligible according to the rules outlined in the SAP Referral Policy, with specific conditions applicable to Vocational Training roles. SAP is proud to be an advocate for diversity and inclusion, and we are committed to creating a workplace where every individual can thrive. Join us in our mission to help the world run better, together.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
punjab
On-site
As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle, and designing and building ETL pipelines for data lake and data warehouse solutions on GCP.

In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools such as Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, BigQuery, Cloud Data Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will use the cloud-native GCP CLI/gsutil for operations, and scripting languages such as Python and SQL to enhance data processing efficiency.

Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will use GCP tools such as Cloud Data Catalog and Cloud KMS to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.
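An illustrative sketch of the kind of pipeline this role involves: an Apache Beam job (runnable locally with the DirectRunner, or on Cloud Dataflow by passing --runner=DataflowRunner plus project/region options) that reads raw JSON lines from Cloud Storage and loads them into BigQuery. The bucket, project, dataset, and schema details are assumptions for the example, not requirements from the posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Defaults to the DirectRunner; Dataflow options would be supplied on the command line.
options = PipelineOptions()


def parse_event(line: str) -> dict:
    # Convert one raw JSON line into a flat record for BigQuery.
    record = json.loads(line)
    return {
        "user_id": record.get("user_id"),
        "event_type": record.get("event_type"),
        "event_ts": record.get("event_ts"),
    }


with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadRaw" >> beam.io.ReadFromText("gs://example-raw-bucket/events/*.json")
        | "Parse" >> beam.Map(parse_event)
        | "DropIncomplete" >> beam.Filter(lambda r: r["user_id"] is not None)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```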
Posted 1 week ago
18.0 - 22.0 years
0 Lacs
noida, uttar pradesh
On-site
This is a senior leadership position within the Business Information Management (BIM) Practice, where you will be responsible for the overall vision, strategy, delivery, and operations of key accounts in BIM. You will work closely with the global executive team, subject matter experts, solution architects, project managers, and client teams to conceptualize, build, and operate Big Data solutions, and you will communicate with internal management, client sponsors, and senior leaders on project status, risks, solutions, and more.

In the client delivery leadership part of the role, you will be accountable for delivering at least $10M+ in revenue using information management solutions such as Big Data, data warehouse, data lake, GenAI, master data management systems, business intelligence and reporting solutions, IT architecture consulting, cloud platforms (AWS/Azure), and SaaS/PaaS-based solutions. In the practice and team leadership part of the role, you will be expected to exhibit self-driven initiative, customer focus, problem-solving skills, learning agility, the ability to handle multiple projects, excellent communication, and the leadership skills to coach and mentor staff.

As a qualified candidate, you should hold an MBA in Business Management and a Bachelor of Computer Science. You should have 18+ years of prior experience, preferably including at least 5 years in the Pharma commercial domain, delivering customer-focused information management solutions. Your skills should encompass successful end-to-end DW implementations using technologies like Big Data, data management, and BI technologies. Leadership qualities, team management experience, communication skills, and hands-on knowledge of databases, SQL, and reporting solutions are essential. Preferred skills include teamwork, leadership, motivation to learn and grow, ownership, cultural fit, talent management, and capability building/thought leadership.

As part of Axtria, a global provider of cloud software and data analytics to the life sciences industry, you will contribute to transforming the product commercialization journey to drive sales growth and improve healthcare outcomes for patients. Axtria values technology innovation and offers a transparent and collaborative culture with opportunities for training, career progression, and meaningful work in a fun environment. If you are a driven and experienced professional with a passion for leadership in information management technology and the Pharma domain, this role offers a unique opportunity to make a significant impact and grow within a dynamic and innovative organization.
Posted 1 week ago
2.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
The role requires you to lead the design and development of Global Supply Chain Analytics applications and provide support for applications from other domains using supply chain data. You will be responsible for hands-on management of applications in Supply Chain Analytics and the wider Operations domain. As a Senior Specialist in Supply Chain Data & Analytics, you will drive the deliverables for important digital initiatives contributing towards strategic priorities. Your role will involve leading multiple projects and digital products, collaborating with team members both internally and externally, and interacting with global business and IT stakeholders to ensure successful solution delivery with standard designs in line with industry best practices.

Your responsibilities will include designing and managing the development of modular, reusable, elegantly designed, and maintainable software solutions that support the Supply Chain organization and other cross-functional strategic initiatives. You will participate in fit-gap workshops with business stakeholders, provide effort estimates and solution proposals, and develop and maintain code repositories while responding rapidly to bug reports or security vulnerability issues. Collaboration with colleagues across departments such as Security, Compliance, Engineering, Project Management, and Product Management will be essential. You will also drive data enablement and build digital products, delivering solutions aligned with business prioritization and in coordination with technology architects, and contribute towards AI/ML initiatives, data quality improvement, business process simplification, and other strategic pillars. You will ensure that delivered solutions adhere to architectural and development standards, best practices, and requirements as recommended in the architecture handbook, align designed solutions with the Data and Analytics strategy standards and roadmap, and provide status reporting to product owners and IT management.

To be successful in this role, you should have a minimum of 8 years of data and analytics experience in a professional environment, with expertise in building applications across platforms. Additionally, you should have experience in delivery management, customer-facing IT roles, machine learning, SAP BW on HANA and/or S/4HANA, and cloud platforms. Strong data engineering fundamentals in data management, data analysis, and back-end system design are required, along with hands-on exposure to Data & Analytics solutions, including predictive and prescriptive analytics.

Key skills for this role include collecting and interpreting requirements, understanding supply chain business processes and KPIs, domain expertise in the pharma industry and/or healthcare, excellent communication and problem-solving skills, knowledge of machine learning and analytical tools, familiarity with Agile and Waterfall delivery concepts, proficiency with tools such as Jira, Confluence, GitHub, and SAP Solution Manager, and hands-on experience with technologies like AWS services, Python, Power BI, and SAP Analytics. The ability to learn new technologies and functional topics quickly is essential.

Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities it serves. If you are passionate about making a difference in the lives of others and are ready to collaborate, support, and inspire breakthroughs, this role offers an opportunity to create a brighter future together.
Posted 1 week ago