
954 OLAP Jobs - Page 26

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role: Data Engineer - ODI (Oracle Data Integrator). Experience: 3 to 8 years. Location: All LTIMindtree office locations.

Job Description: 3 to 8 years of IT experience in the development and implementation of Business Intelligence and data warehousing solutions using ODI. Knowledge of analysis, design, development, customization, implementation and maintenance of Oracle Data Integrator (ODI). Experience in designing, implementing, and maintaining ODI load plans and processes. Working knowledge of ODI, PL/SQL, TOAD, data modelling (logical/physical), star/snowflake schemas, fact and dimension tables, ELT and OLAP. Experience with SQL, UNIX, complex queries, stored procedures and data warehouse best practices. Ensures correctness and completeness of data loading (full and incremental loads). Excellent communication skills; organized and effective in delivering high-quality solutions using ODI. This job is provided by Shine.com
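The posting distinguishes full loads from incremental loads into star-schema fact and dimension tables. ODI itself is a GUI-driven ELT tool, so the following is an illustrative sketch only: a minimal Python example (standard-library sqlite3, which bundles SQLite 3.24+ in modern Python) of an idempotent incremental load keyed on a business key. All table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_id  TEXT UNIQUE,   -- business key from the source system
        name         TEXT
    );
    CREATE TABLE fact_sales (
        sale_id      TEXT PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount       REAL
    );
""")

def incremental_load(rows):
    """Upsert source rows; re-running the same batch changes nothing."""
    for r in rows:
        conn.execute(
            "INSERT INTO dim_customer (customer_id, name) VALUES (?, ?) "
            "ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name",
            (r["customer_id"], r["name"]),
        )
        (key,) = conn.execute(
            "SELECT customer_key FROM dim_customer WHERE customer_id = ?",
            (r["customer_id"],),
        ).fetchone()
        conn.execute(
            "INSERT OR REPLACE INTO fact_sales VALUES (?, ?, ?)",
            (r["sale_id"], key, r["amount"]),
        )
    conn.commit()

batch = [{"customer_id": "C1", "name": "Acme", "sale_id": "S1", "amount": 100.0}]
incremental_load(batch)
incremental_load(batch)  # idempotent: the completeness check below still passes
assert conn.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0] == 1
```

Because re-running a batch leaves the warehouse unchanged, the row-count check at the end is safe to automate as a "correctness and completeness of data loading" gate.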

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Greater Kolkata Area

On-site

Role: Data Engineer - ODI (Oracle Data Integrator). Experience: 3 to 8 years. Location: All LTIMindtree office locations.

Job Description: 3 to 8 years of IT experience in the development and implementation of Business Intelligence and data warehousing solutions using ODI. Knowledge of analysis, design, development, customization, implementation and maintenance of Oracle Data Integrator (ODI). Experience in designing, implementing, and maintaining ODI load plans and processes. Working knowledge of ODI, PL/SQL, TOAD, data modelling (logical/physical), star/snowflake schemas, fact and dimension tables, ELT and OLAP. Experience with SQL, UNIX, complex queries, stored procedures and data warehouse best practices. Ensures correctness and completeness of data loading (full and incremental loads). Excellent communication skills; organized and effective in delivering high-quality solutions using ODI. This job is provided by Shine.com

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Role: Data Engineer - ODI (Oracle Data Integrator). Experience: 3 to 8 years. Location: All LTIMindtree office locations.

Job Description: 3 to 8 years of IT experience in the development and implementation of Business Intelligence and data warehousing solutions using ODI. Knowledge of analysis, design, development, customization, implementation and maintenance of Oracle Data Integrator (ODI). Experience in designing, implementing, and maintaining ODI load plans and processes. Working knowledge of ODI, PL/SQL, TOAD, data modelling (logical/physical), star/snowflake schemas, fact and dimension tables, ELT and OLAP. Experience with SQL, UNIX, complex queries, stored procedures and data warehouse best practices. Ensures correctness and completeness of data loading (full and incremental loads). Excellent communication skills; organized and effective in delivering high-quality solutions using ODI. This job is provided by Shine.com

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role: Data Engineer - ODI (Oracle Data Integrator). Experience: 3 to 8 years. Location: All LTIMindtree office locations.

Job Description: 3 to 8 years of IT experience in the development and implementation of Business Intelligence and data warehousing solutions using ODI. Knowledge of analysis, design, development, customization, implementation and maintenance of Oracle Data Integrator (ODI). Experience in designing, implementing, and maintaining ODI load plans and processes. Working knowledge of ODI, PL/SQL, TOAD, data modelling (logical/physical), star/snowflake schemas, fact and dimension tables, ELT and OLAP. Experience with SQL, UNIX, complex queries, stored procedures and data warehouse best practices. Ensures correctness and completeness of data loading (full and incremental loads). Excellent communication skills; organized and effective in delivering high-quality solutions using ODI. This job is provided by Shine.com

Posted 1 month ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About The Role
We are seeking an experienced Senior Backend Developer with strong expertise in Java, the Spring framework, and high-availability service design. This role will be pivotal in designing, developing, and optimizing robust backend systems that power our index and product generation platforms while providing technical leadership within the team. You'll be joining a dynamic team focused on solving complex challenges in delivering near real-time financial data with high throughput and resiliency requirements.

About The Team
This is an excellent opportunity to join the Index IT team, part of a delivery-focused IT group responsible for designing, developing and supporting internal, client and public-facing distribution solutions. If selected, you will work as part of a delivery-focused and talented software development team responsible for designing, developing and supporting the index and product generation platforms. You will use cutting-edge software development techniques and technologies, following the best practices of the industry. Our team solves challenging problems around delivering near real-time financial data, working with large flexible schemas and building database systems that provide exceptional throughput and resiliency. We leverage the latest technologies, including Kubernetes and continuous integration/deployment pipelines, and build highly observable applications. MSCI provides a very attractive compensation package, an exciting work environment and opportunities for continuous self-development and career advancement for the right candidates.

Key Responsibilities
- Design, develop, and maintain scalable, high-performance backend applications using Java and the Spring framework
- Lead the architecture and implementation of complex API services that interact with high-availability database systems
- Develop solutions for processing and delivering near real-time financial data streams
- Design flexible schemas that can accommodate evolving financial data requirements
- Collaborate closely with product managers, business analysts, and other developers to translate business requirements into technical solutions
- Design and optimize OLAP database interactions for analytical performance and high availability
- Implement observable applications with comprehensive monitoring and logging
- Design and develop RESTful APIs following industry best practices
- Lead code reviews and mentor junior developers on team best practices
- Participate in the full software development lifecycle from requirements analysis through deployment
- Troubleshoot and resolve complex production issues in high-throughput systems
- Evaluate and recommend new technologies and approaches to improve system performance and developer productivity
- Contribute to technical documentation and system design specifications

Preferred Qualifications
- Master's degree in Computer Science, Software Engineering, or a related field
- Experience with Kubernetes and containerized application deployment
- Experience with observability frameworks such as OpenTelemetry (OTEL)
- Proficiency with continuous integration and deployment methodologies (CI/CD)
- Knowledge of cloud platforms (AWS, Azure, or GCP)
- Experience with microservices architecture
- Experience with containerization technologies (Docker)
- Understanding of DevOps practices
- Experience with message brokers (Kafka, RabbitMQ)
- Background in agile development methodologies
- Experience with test-driven development and automated testing frameworks
- Familiarity with financial data models and structures
- Background in financial services or experience with financial data

Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 7+ years of professional experience in backend software development
- 5+ years of experience with Java programming and core Java concepts
- 3+ years of experience with the Spring framework (Spring Boot, Spring MVC, Spring Data)
- Familiarity with OLAP concepts and high-availability database design principles
- Experience building systems that handle large data volumes with high throughput requirements
- Proficiency in SQL and database optimization techniques
- Experience with RESTful API design and implementation
- Solid understanding of design patterns and object-oriented programming
- Experience with version control systems (Git)
- Strong problem-solving skills and attention to detail
- Excellent communication skills to collaborate effectively across teams and explain technical concepts to non-technical stakeholders

What We Offer You
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients. A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose - to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com
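This role is explicitly Java/Spring, but the "observable applications" requirement is language-agnostic. As an illustrative sketch only (rendered in Python for brevity, not MSCI's stack), the following wraps a request handler in structured logging and latency measurement; the service name, endpoint and payload are hypothetical.

```python
import json
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("index-service")  # hypothetical service name

def observed(endpoint):
    """Wrap a handler with a structured request log and latency measurement."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            status = "ok"
            try:
                return fn(*args, **kwargs)
            except Exception:
                status = "error"
                raise
            finally:
                log.info(json.dumps({
                    "endpoint": endpoint,
                    "status": status,
                    "latency_ms": round((time.perf_counter() - start) * 1000, 2),
                }))
        return wrapper
    return decorator

@observed("/v1/index-levels")
def get_index_levels(index_id: str) -> dict:
    return {"index_id": index_id, "level": 1234.56}  # stand-in for an OLAP read

get_index_levels("MSCI-WORLD")
```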

Posted 1 month ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.

Job Description
In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's product development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.

QA Automation Engineer
As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.

Responsibilities
- Develop and implement automation frameworks: design, build, and maintain scalable test automation frameworks tailored for data warehousing environments.
- Test strategy and execution: define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources.
- Data validation: implement automated tests to validate data consistency, accuracy, completeness, and transformation logic.
- Performance testing: ensure that the data warehouse systems meet performance benchmarks through automation tools and load-testing strategies.
- Collaborate with teams: work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly.
- Continuous integration: integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process.
- Defect tracking and reporting: use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner.
- Test data management: develop strategies for handling large volumes of test data while maintaining data security and privacy.
- Tool and technology evaluation: stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.

Job Qualifications: Requirements and Skills
- At least 4 years of experience.
- Solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.).
- Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing.
- Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations.
- Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes.
- Performance testing experience.
- Experience with version control systems like Git.
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues.
- Strong communication and collaboration skills.
- Attention to detail and a passion for delivering high-quality solutions.
- Ability to work in a fast-paced environment and manage multiple priorities.
- Enthusiasm for learning new technologies and frameworks.

Experience with the following tools and technologies is desired: Qlik Replicate, Matillion ETL, Snowflake, Data Vault warehouse design, Power BI, Azure Cloud (including Logic Apps, Azure Functions, ADF).

Cotality's Diversity Commitment: Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone's unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.

Equal Opportunity Employer Statement: Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.

Privacy Policy: Global Applicant Privacy Policy. By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBE and will automatically be opted out company-wide.
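The validation duties above (row-count reconciliation, transformation checks) map naturally onto pytest. Below is a minimal, self-contained sketch, assuming an in-memory SQLite database standing in for both the source system and the warehouse; the tables and the cents-to-dollars transformation are invented for illustration.

```python
import sqlite3
import pytest

@pytest.fixture()
def db():
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_orders (id INTEGER, amount_cents INTEGER);
        CREATE TABLE wh_orders  (id INTEGER, amount_dollars REAL);
        INSERT INTO src_orders VALUES (1, 1999), (2, 500);
        -- simulate the ETL under test: cents -> dollars
        INSERT INTO wh_orders SELECT id, amount_cents / 100.0 FROM src_orders;
    """)
    yield conn
    conn.close()

def test_row_counts_reconcile(db):
    src = db.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = db.execute("SELECT COUNT(*) FROM wh_orders").fetchone()[0]
    assert src == tgt, "ETL dropped or duplicated rows"

def test_transformation_logic(db):
    mismatched = db.execute("""
        SELECT s.id FROM src_orders s JOIN wh_orders w USING (id)
        WHERE ABS(s.amount_cents / 100.0 - w.amount_dollars) > 1e-9
    """).fetchall()
    assert mismatched == [], f"amount mismatch for ids: {mismatched}"
```

In a real suite the fixture would connect to the actual source and warehouse instead of SQLite, but the assertion pattern stays the same.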

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Career Opportunity with Burckhardt Compression

Role: We are seeking a motivated and experienced professional who can contribute effectively to the deliverables of the position below. In this position you can actively participate in our growth and make a significant impact in a fast-paced environment as:

Position: Data Engineer. Location: Pune.

Your Contributions To The Organisation's Growth:
- Maintain and develop data platforms based on Microsoft Fabric for Business Intelligence and Databricks for real-time data analytics.
- Design, implement and maintain standardized, production-grade data pipelines using modern data transformation processes and workflows for SAP, MS Dynamics, on-premise or cloud.
- Develop an enterprise-scale, cloud-based data lake for business intelligence solutions.
- Translate business and customer needs into data collection, preparation and processing requirements.
- Optimize the performance of algorithms developed by Data Scientists.
- General administration and monitoring of the data platforms.

Competencies:
- Working with structured and unstructured data.
- Experienced in various database technologies (RDBMS, OLAP, time series, etc.).
- Solid programming skills (Python, SQL; Scala is a plus).
- Experience in Microsoft Fabric (incl. Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Model) and/or Databricks (Spark).
- Proficient in Power BI.
- Experienced working with APIs.
- Proficient in security best practices.
- Data-centered Azure know-how is a plus (Storage, Networking, Security, Billing).

Expertise you bring:
- Bachelor's or Master's degree in business informatics, computer science, or equivalent.
- A background in software engineering (e.g., agile programming, project organization) and experience with human-centered design would be desirable.
- Extensive experience in handling large data sets.
- At least 5 years of experience as a data engineer, preferably in an industrial company.
- Analytical problem-solving skills and the ability to assimilate complex information.
- Programming experience in modern data-oriented languages (SQL, Python).
- Experience with Apache Spark and DevOps.
- Proven ability to synthesize complex data; advanced technical skills in data modelling, data mining, database design and performance tuning.
- English language proficiency.

Special Requirements:
- A high-quality mindset paired with strong customer orientation, critical thinking, and attention to detail.
- Understanding of data processing at scale.
- Ability to influence without authority.
- Willingness to acquire additional system/technical knowledge as needed.
- Problem solver.
- Experience working in an international organization and in multicultural teams.
- Proactive, creative and innovative.

We Offer: We have an open culture that encourages employees to get involved in activities that interest them. Our flexible working models allow you to combine private interests with work. Employee Connect, engagement events and a feedback culture enhance our reach and give us an opportunity to continuously improve. Performance and appreciation awards. Sports activities and the Klib Library to energize you. We proudly encourage diversity and inclusion in thought and in spirit. A winner of GreenCo Gold and various other ISO certifications, we encourage you to embrace the same and contribute to a much greener tomorrow! We aspire to be a Great Place to Work soon, to provide you an enticing career with us.

HR Team, Burckhardt Compression India
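Since the role centers on standardized, production-grade pipelines in Databricks/Spark, here is a minimal PySpark sketch of one pipeline step (read, transform, write). It is a sketch only, assuming a Spark environment where the Delta format is available (e.g., Databricks); the landing path, lake path, and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sap-orders-pipeline").getOrCreate()

def transform(df):
    """Standardize a raw SAP order extract: typed columns, dedup, audit column."""
    return (
        df.withColumn("order_ts", F.to_timestamp("order_ts"))
          .withColumn("amount", F.col("amount").cast("double"))
          .dropDuplicates(["order_id"])
          .withColumn("ingested_at", F.current_timestamp())
    )

raw = spark.read.json("/landing/sap/orders/")   # hypothetical landing path
(transform(raw)
    .write.mode("append")
    .format("delta")                            # assumes Delta Lake is installed
    .save("/lake/orders/"))                     # hypothetical lake path
```

Keeping the transform as a pure function of a DataFrame is what makes the step testable and reusable across sources, which is the point of "standardized" pipelines.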

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Pune, Gurugram, Jaipur

Work from Office

Role & responsibilities:
- Should be very good at Erwin data modelling.
- Should have good knowledge of data quality and data catalogs.
- Should have a good understanding of data lineage.
- Should have a good understanding of data modelling in both OLTP and OLAP systems.
- Should have worked on a data warehouse or big data architecture.
- Should be very good at ANSI SQL.
- Should have a good understanding of data visualization.
- Should be very comfortable with, and experienced in, data analysis.
- Should have good knowledge of data cataloging tools and data access policies.
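Modelling for both OLTP and OLAP systems is the recurring requirement in these postings. As a minimal, hypothetical sketch of the difference, the following Python module holds two DDL strings: a normalized, write-optimized OLTP design versus a denormalized star-schema OLAP design. All table and column names are invented for illustration.

```python
# Normalized OLTP design: small, write-optimized tables in third normal form;
# each business entity lives in exactly one place, so single-row updates are cheap.
OLTP_DDL = """
CREATE TABLE customer (customer_id INT PRIMARY KEY, name TEXT);
CREATE TABLE product  (product_id  INT PRIMARY KEY, name TEXT, unit_price REAL);
CREATE TABLE orders   (order_id    INT PRIMARY KEY,
                       customer_id INT REFERENCES customer,
                       product_id  INT REFERENCES product,
                       qty INT, ordered_at TIMESTAMP);
"""

# Star-schema OLAP design: one wide fact table plus denormalized dimensions,
# trading redundancy for fast scans, joins on surrogate keys, and aggregations.
OLAP_DDL = """
CREATE TABLE dim_date     (date_key INT PRIMARY KEY, day DATE, month INT, year INT);
CREATE TABLE dim_customer (customer_key INT PRIMARY KEY, name TEXT, segment TEXT);
CREATE TABLE fact_sales   (date_key INT, customer_key INT,
                           qty INT, revenue REAL);  -- grain: one row per sale
"""
```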

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Should be very good at Erwin data modelling.
- Should have good knowledge of data quality and data catalogs.
- Should have a good understanding of data lineage.
- Should have a good understanding of data modelling in both OLTP and OLAP systems.
- Should have worked on a data warehouse or big data architecture.
- Should be very good at ANSI SQL.
- Should have a good understanding of data visualization.
- Should be very comfortable with, and experienced in, data analysis.
- Should have good knowledge of data cataloging tools and data access policies.

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Pune, Gurugram, Jaipur

Work from Office

Role & responsibilities:
- Should be very good at Erwin data modelling.
- Should have good knowledge of data quality and data catalogs.
- Should have a good understanding of data lineage.
- Should have a good understanding of data modelling in both OLTP and OLAP systems.
- Should have worked on a data warehouse or big data architecture.
- Should be very good at ANSI SQL.
- Should have a good understanding of data visualization.
- Should be very comfortable with, and experienced in, data analysis.
- Should have good knowledge of data cataloging tools and data access policies.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Should be very good at Erwin data modelling.
- Should have good knowledge of data quality and data catalogs.
- Should have a good understanding of data lineage.
- Should have a good understanding of data modelling in both OLTP and OLAP systems.
- Should have worked on a data warehouse or big data architecture.
- Should be very good at ANSI SQL.
- Should have a good understanding of data visualization.
- Should be very comfortable with, and experienced in, data analysis.
- Should have good knowledge of data cataloging tools and data access policies.

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune, Gurugram, Jaipur

Work from Office

Role & responsibilities:
- Should be very good at Erwin data modelling.
- Should have good knowledge of data quality and data catalogs.
- Should have a good understanding of data lineage.
- Should have a good understanding of data modelling in both OLTP and OLAP systems.
- Should have worked on a data warehouse or big data architecture.
- Should be very good at ANSI SQL.
- Should have a good understanding of data visualization.
- Should be very comfortable with, and experienced in, data analysis.
- Should have good knowledge of data cataloging tools and data access policies.

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Should be very good at Erwin data modelling.
- Should have good knowledge of data quality and data catalogs.
- Should have a good understanding of data lineage.
- Should have a good understanding of data modelling in both OLTP and OLAP systems.
- Should have worked on a data warehouse or big data architecture.
- Should be very good at ANSI SQL.
- Should have a good understanding of data visualization.
- Should be very comfortable with, and experienced in, data analysis.
- Should have good knowledge of data cataloging tools and data access policies.

Posted 1 month ago

Apply

8.0 - 10.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site

Job Summary
Synechron is seeking an experienced and detail-oriented Senior MSBI Developer with expertise in MSBI (Microsoft Business Intelligence) to join our data and analytics team. In this role, you will contribute to designing, developing, and maintaining robust reporting and data integration solutions that support our business objectives. Your expertise will help deliver actionable insights, improve decision-making processes, and enhance overall data management efficiency within the organization.

Software Requirements
Required skills:
- MSBI suite (including SSIS, SSRS, SSAS)
- SQL Server (including SQL Server Management Studio and query performance tuning); recent versions of SQL Server (2016 or later preferred)
- Proven experience in creating complex reports, data transformation, and integration workflows
Preferred skills:
- Power BI or other visualization tools
- Experience with cloud-based data solutions (e.g., Azure SQL, Synapse Analytics)

Overall Responsibilities
- Develop, implement, and maintain MSBI solutions such as SSIS packages, SSRS reports, and data models to meet business requirements
- Collaborate with business stakeholders and data teams to gather reporting needs and translate them into scalable solutions
- Optimize and troubleshoot existing reports and data pipelines to improve performance and reliability
- Ensure data accuracy, security, and compliance within reporting processes
- Document solution architectures, workflows, and processes for ongoing support and knowledge sharing
- Participate in team initiatives to enhance data governance and best practices
- Contribute to strategic planning for data platform evolution and modernization

Technical Skills (by category)
- Programming languages. Required: SQL (advanced proficiency in query writing, stored procedures, and performance tuning). Preferred: T-SQL scripting for data transformations and automation.
- Databases / data management. Required: deep knowledge of relational database concepts with extensive experience in SQL Server databases. Preferred: familiarity with data warehouse concepts, OLAP cubes, and data mart design.
- Cloud technologies. Desired: basic understanding of cloud-based data platforms like Azure Data Factory and Azure Synapse.
- Frameworks and libraries: not directly applicable; the focus is on MSBI tools.
- Development tools and methodologies: experience working within Agile development environments; data pipeline development and testing best practices.
- Security protocols: implement data security measures and role-based access controls, and ensure compliance with data privacy policies.

Experience Requirements
- 8 to 10 years of professional experience in software development with substantial hands-on MSBI expertise
- Demonstrated experience in designing and deploying enterprise-level BI solutions
- Domain experience in finance, healthcare, retail, or similar industries is preferred
- Alternatively, extensive prior experience with BI tools and proven success in similar roles may be considered in lieu of exact industry background

Day-to-Day Activities
- Design and develop SSIS data integration workflows to automate data loading processes
- Create and optimize SSRS reports and dashboards for various organizational units
- Troubleshoot and resolve technical issues in existing BI solutions
- Collaborate with data architects, developers, and business analysts to align data solutions with business needs
- Conduct code reviews, testing, and validation of reports and data pipelines
- Participate in scrum meetings, planning sessions, and stakeholder discussions
- Document solutions, processes, and workflows for ease of maintenance and scalability

Qualifications
- Bachelor's degree or equivalent in Computer Science, Information Technology, or a related field
- Relevant certifications in Microsoft BI or SQL Server (e.g., Microsoft Certified Data Engineer Associate) preferred
- Ongoing engagement in professional development related to BI, data management, and analytics tools

Professional Competencies
- Analytical mindset with strong problem-solving abilities in data solution development
- Capable of working collaboratively across diverse teams and communicating technical concepts effectively
- Stakeholder management skills to interpret and prioritize reporting needs
- Adaptability to evolving technologies and a continuous-learning mindset
- Focus on delivering high-quality, sustainable data solutions with attention to detail
- Effective time management, prioritizing tasks to meet project deadlines

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative-action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture, promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
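SSIS and SSRS are designed around graphical tooling, but the SQL-side work the role describes (parameterized queries, validating report aggregates against the warehouse) can be sketched in code. The following is a hypothetical example using the pyodbc package against SQL Server; the driver string, server, database and table names are invented, and an installed SQL Server ODBC driver is assumed.

```python
import pyodbc  # pip install pyodbc; assumes a SQL Server ODBC driver is installed

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reporting-db;DATABASE=SalesDW;Trusted_Connection=yes;"  # hypothetical
)
cursor = conn.cursor()

# Reconcile a report aggregate against the underlying fact table.
cursor.execute(
    """
    SELECT region, SUM(amount) AS total
    FROM dbo.FactSales
    WHERE sale_date >= ?
    GROUP BY region
    """,
    "2024-01-01",
)
for region, total in cursor.fetchall():
    print(region, total)
conn.close()
```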

Posted 2 months ago

Apply

2.0 years

6 - 8 Lacs

Bengaluru

On-site

Job Description Summary
Responsible for developing, testing and implementing data engineering solutions to generate analytical and reporting solutions. Responsible for analyzing and preparing the data needed for data-science-based outcomes. Also responsible for managing and maintaining metadata data structures, and for providing necessary support for post-deployment activities when needed.

Job Description

Company Overview: Working at GE Aerospace means you are bringing your unique perspective, innovative spirit, drive, and curiosity to a collaborative and diverse team working to advance aerospace for future generations. If you have ideas, we will listen. Join us and see your ideas take flight!

Site Overview: Established in 2000, the John F. Welch Technology Center (JFWTC) in Bengaluru is our multidisciplinary research and engineering center. Engineers and scientists at JFWTC have contributed to hundreds of aviation patents, pioneering breakthroughs in engine technologies, advanced materials, and additive manufacturing.

Roles Overview: In this role, you will:
- Leverage technical data dictionaries and business glossaries to analyze the datasets
- Perform data profiling and data analysis for any source systems and the target data repositories
- Understand metadata and the underlying data structures needed to standardize the data load processes
- Develop data mapping specifications based on the results of data analysis and functional requirements
- Perform a variety of data loads and data transformations using multiple tools and technologies
- Build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications
- Validate the data mapping results and match them against the expected results
- Implement the Data Quality (DQ) rules provided

Ideal Candidate: Should have experience in data loads and data transformations using multiple tools and technologies.

Required Qualifications: For roles outside the USA: Bachelor's degree with basic experience. For roles in the USA: Bachelor's degree with a minimum of 2 years of experience.

Desired Characteristics

Technical Expertise:
- Creating and updating our Standard Work documentation, from operations, to PBR migrations, to how we use GitHub
- Building / re-building clusters as part of our monthly build process
- Creating Python scripts to enable future automation (e.g., a Python script for creating an S3 bucket)
- Assisting with our ongoing pursuit of remediating critical vulnerabilities as part of our EVM obligations
- Ability to understand logical and physical data models, big data storage architecture, data modeling methodologies, metadata management, master data management and data lineage techniques
- Hands-on experience in programming languages like Java, Python or Scala
- Hands-on experience in writing SQL scripts for Oracle, MySQL, PostgreSQL or Hive
- Experience in handling both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) data models
- Experience with Big Data / Hadoop / Spark / Hive / NoSQL database engines (e.g., Cassandra or HBase)
- Exposure to unstructured datasets and the ability to handle XML and JSON file formats
- Exposure to Extract, Transform & Load (ETL) tools like Informatica or Talend

Domain Expertise:
- Exposure to handling machine or sensor datasets from industrial businesses
- Knowledge of industrial applications in commercial/finance/manufacturing settings
- Exposure to finance and accounting data domains

Leadership Skills:
- Partner with other team members to understand the project objectives and resolve technical issues
- Communicate project status or challenges in a clear and concise manner to cross-team members

Desired Qualification:
- Humble: respectful, receptive, agile, eager to learn
- Transparent: shares critical information, speaks with candor, contributes constructively
- Focused: quick learner, strategically prioritizes work, committed
- Leadership ability: strong communicator, decision-maker, collaborative
- Problem solver: analytical-minded, challenges existing processes, critical thinker

At GE Aerospace, we have a relentless dedication to the future of safe and more sustainable flight and believe in our talented people to make it happen. Here, you will have the opportunity to work on really cool things with really smart and collaborative people. Together, we will mobilize a new era of growth in aerospace and defense. Where others stop, we accelerate.

Additional Information: Relocation Assistance Provided: No
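Data profiling and data-quality (DQ) rules are concrete enough to sketch. Below is a minimal, hypothetical pandas example: per-column profiling (null and distinct counts) followed by one DQ rule check. The sensor columns and range bounds are invented, loosely echoing the industrial sensor datasets mentioned above.

```python
import pandas as pd

df = pd.DataFrame({  # stand-in for a source-system extract
    "sensor_id": ["A1", "A2", None, "A1"],
    "reading":   [20.5, None, 19.8, 21.1],
})

# Profiling: nulls and distinct counts per column.
profile = pd.DataFrame({
    "nulls":    df.isna().sum(),
    "distinct": df.nunique(),
})
print(profile)

# A DQ rule: sensor_id must be populated and readings must fall in [0, 100];
# rows with a null reading also fail, since between() is False for NaN.
violations = df[df["sensor_id"].isna() | ~df["reading"].between(0, 100)]
print(f"{len(violations)} row(s) violate DQ rules")
```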

Posted 2 months ago

Apply

0 years

4 - 8 Lacs

Noida

On-site

City/Cities: Noida. Country: India. Working Schedule: Full-Time. Work Arrangement: Hybrid. Relocation Assistance Available: No. Posted Date: 06-Jun-2025. Job ID: 9603.

Description and Requirements
Position Summary: We are looking for someone who, with guidance and direction, can develop frontends using Java; who has the acumen and interest to learn and work on new technologies in process automation; and who has good problem-solving skills, understands the importance of delivery timelines, and can operate independently.

Job Responsibilities:
- Involvement in solution planning
- Convert business specifications to technical specifications
- Write clean code and review the code of project team members (as applicable)
- Adhere to the Agile delivery model
- Able to solve L3 application-related issues
- Should be able to scale up on new technologies
- Should be able to document project artifacts

Technical Skills:
- Database and data warehouse skills; Object-Oriented Programming, design patterns, and development knowledge
- Azure Cloud experience, with cloud-native development as well as migration of existing applications
- Hands-on development and implementation experience in Azure Data Factory, Azure Databricks, Azure App Services and Azure Service Bus
- Agile development and DevSecOps understanding of the end-to-end development lifecycle is required
- Experience in cutting-edge OLAP cube technologies like Kyligence would be a plus
- Preferably has worked in the financial domain

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!

Posted 2 months ago

Apply

0 years

3 - 5 Lacs

Noida

On-site

City/Cities: Noida. Country: India. Working Schedule: Full-Time. Work Arrangement: Hybrid. Relocation Assistance Available: No. Posted Date: 06-Jun-2025. Job ID: 9602.

Description and Requirements
- Involvement in solution planning
- Convert business specifications to technical specifications
- Write clean code and review the code of project team members (as applicable)
- Adhere to the Agile delivery model
- Able to solve L3 application-related issues
- Should be able to scale up on new technologies
- Should be able to document project artifacts

Technical Skills:
- Database and data warehouse skills; Object-Oriented Programming, design patterns, and development knowledge
- Azure Cloud experience, with cloud-native development as well as migration of existing applications
- Hands-on development and implementation experience in Azure Data Factory, Azure Databricks, Azure App Services and Azure Service Bus
- Agile development and DevSecOps understanding of the end-to-end development lifecycle is required
- Experience in cutting-edge OLAP cube technologies like Kyligence would be a plus
- Preferably has worked in the financial domain

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!

Posted 2 months ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic) and an understanding of cloud concepts: SSL/TLS, SQL, REST, JDBC, JavaScript, JSON. Strong hands-on experience in SnapLogic design/development, with good working experience using various snaps for JDBC, SAP, files, REST, SOAP, etc. Good to have the ability to build complex mappings with JSON path expressions, flat files, and Python scripting. Good to have experience in Groundplex and Cloudplex integrations. Should be able to deliver the project by leading a team of 6-8 members. Should have experience in integration projects with heterogeneous landscapes. Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL and Redshift). Real-time experience working with OLAP and OLTP database models (dimensional models).
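The posting pairs JSON path expressions with Python scripting. As a hedged illustration of the same mapping idea outside SnapLogic itself, here is a sketch using the third-party jsonpath-ng package; the payload shape and the expression are hypothetical.

```python
from jsonpath_ng import parse  # pip install jsonpath-ng

payload = {  # hypothetical REST response routed through an integration pipeline
    "order": {
        "id": "O-1001",
        "lines": [
            {"sku": "ABC", "qty": 2},
            {"sku": "XYZ", "qty": 5},
        ],
    }
}

# Mapping: pull every line-item SKU with the path $.order.lines[*].sku
skus = [m.value for m in parse("$.order.lines[*].sku").find(payload)]
assert skus == ["ABC", "XYZ"]
```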

Posted 2 months ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Description and Requirements
Position Summary: We are looking for someone who, with guidance and direction, can develop frontends using Java; who has the acumen and interest to learn and work on new technologies in process automation; and who has good problem-solving skills, understands the importance of delivery timelines, and can operate independently.

Job Responsibilities:
- Involvement in solution planning
- Convert business specifications to technical specifications
- Write clean code and review the code of project team members (as applicable)
- Adhere to the Agile delivery model
- Able to solve L3 application-related issues
- Should be able to scale up on new technologies
- Should be able to document project artifacts

Technical Skills:
- Database and data warehouse skills; Object-Oriented Programming, design patterns, and development knowledge
- Azure Cloud experience, with cloud-native development as well as migration of existing applications
- Hands-on development and implementation experience in Azure Data Factory, Azure Databricks, Azure App Services and Azure Service Bus
- Agile development and DevSecOps understanding of the end-to-end development lifecycle is required
- Experience in cutting-edge OLAP cube technologies like Kyligence would be a plus
- Preferably has worked in the financial domain

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!

Posted 2 months ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Description and Requirements
- Involvement in solution planning
- Convert business specifications to technical specifications
- Write clean code and review the code of project team members (as applicable)
- Adhere to the Agile delivery model
- Able to solve L3 application-related issues
- Should be able to scale up on new technologies
- Should be able to document project artifacts

Technical Skills:
- Database and data warehouse skills; Object-Oriented Programming, design patterns, and development knowledge
- Azure Cloud experience, with cloud-native development as well as migration of existing applications
- Hands-on development and implementation experience in Azure Data Factory, Azure Databricks, Azure App Services and Azure Service Bus
- Agile development and DevSecOps understanding of the end-to-end development lifecycle is required
- Experience in cutting-edge OLAP cube technologies like Kyligence would be a plus
- Preferably has worked in the financial domain

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!

Posted 2 months ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

Remote

As a global leader in cybersecurity, CrowdStrike protects the people, processes and technologies that drive modern organizations. Since 2011, our mission hasn’t changed — we’re here to stop breaches, and we’ve redefined modern security with the world’s most advanced AI-native platform. We work on large-scale distributed systems, processing almost 3 trillion events per day. We have 3.44 PB of RAM deployed across our fleet of C* servers — and this traffic is growing daily. Our customers span all industries, and they count on CrowdStrike to keep their businesses running, their communities safe and their lives moving forward. We’re also a mission-driven company. We cultivate a culture that gives every CrowdStriker both the flexibility and autonomy to own their careers. We’re always looking to add talented CrowdStrikers to the team who have limitless passion, a relentless focus on innovation and a fanatical commitment to our customers, our community and each other. Ready to join a mission that matters? The future of cybersecurity starts with you.

About The Role
The charter of the Data + ML Platform team is to harness all the data that is ingested and cataloged within the Data LakeHouse for exploration, insights, model development, ML engineering and insights activation. This team is situated within the larger Data Platform group, which serves as one of the core pillars of our company. We process data at a truly immense scale. Our processing is composed of various facets, including threat events collected via telemetry data, associated metadata, IT asset information, and contextual information about threat exposure based on additional processing. These facets comprise the overall data platform, which is currently over 200 PB and maintained in a hyper-scale Data Lakehouse, built and owned by the Data Platform team. The ingestion mechanisms include both batch and near real-time streams that form the core Threat Analytics Platform used for insights, threat hunting, incident investigations and more. As an engineer in this team, you will play an integral role as we build out our ML Experimentation Platform from the ground up. You will collaborate closely with Data Platform software engineers, data scientists and threat analysts to design, implement, and maintain scalable ML pipelines used for data preparation, cataloging, feature engineering, model training, and model serving that influence critical business decisions. You’ll be a key contributor in a production-focused culture that bridges the gap between model development and operational success. Future plans include generative AI investments for use cases such as modeling attack paths for IT assets.

What You’ll Do
- Help design, build, and facilitate adoption of a modern Data+ML platform
- Modularize complex ML code into standardized and repeatable components
- Establish and facilitate adoption of repeatable patterns for model development, deployment, and monitoring
- Build a platform that scales to thousands of users and offers self-service capability to build ML experimentation pipelines
- Leverage workflow orchestration tools to deploy efficient and scalable execution of complex data and ML pipelines
- Review code changes from data scientists and champion software development best practices
- Leverage cloud services like Kubernetes, blob storage, and queues in our cloud-first environment

What You’ll Need
- B.S. in Computer Science, Data Science, Statistics, Applied Mathematics, or a related field and 7+ years of related experience; or an M.S. with 5+ years of experience; or a Ph.D. with 6+ years of experience
- 3+ years of experience developing and deploying machine learning solutions to production
- Familiarity with typical machine learning algorithms from an engineering perspective (how they are built and used, not necessarily the theory); familiarity with supervised/unsupervised approaches: how, why, and when labeled data is created and used
- 3+ years of experience with ML platform tools like Jupyter Notebooks, NVIDIA Workbench, MLflow, Ray, Vertex AI, etc.
- Experience building data platform products or features with Apache Spark, Flink or comparable tools in GCP; experience with Iceberg is highly desirable
- Proficiency in distributed computing and orchestration technologies (Kubernetes, Airflow, etc.)
- Production experience with infrastructure-as-code tools such as Terraform and FluxCD
- Expert-level experience with Python; Java/Scala exposure is recommended
- Ability to write Python interfaces that provide standardized and simplified access to internal CrowdStrike tools for data scientists
- Expert-level experience with CI/CD frameworks such as GitHub Actions
- Expert-level experience with containerization frameworks
- Strong analytical and problem-solving skills, capable of working in a dynamic environment
- Exceptional interpersonal and communication skills; able to work with stakeholders across multiple teams and synthesize their needs into software interfaces and processes

Experience with the following is desirable: Go, Iceberg, Pinot or another time-series/OLAP-style database, Jenkins, Parquet, Protocol Buffers/gRPC.

Benefits Of Working At CrowdStrike
- Remote-friendly and flexible work culture
- Market leader in compensation and equity awards
- Comprehensive physical and mental wellness programs
- Competitive vacation and holidays for recharge
- Paid parental and adoption leaves
- Professional development opportunities for all employees regardless of level or role
- Employee Resource Groups, geographic neighbourhood groups and volunteer opportunities to build connections
- Vibrant office culture with world-class amenities
- Great Place to Work Certified™ across the globe

CrowdStrike is proud to be an equal opportunity employer. We are committed to fostering a culture of belonging where everyone is valued for who they are and empowered to succeed. We support veterans and individuals with disabilities through our affirmative action program. CrowdStrike is committed to providing equal employment opportunity for all employees and applicants for employment. The Company does not discriminate in employment opportunities or practices on the basis of race, color, creed, ethnicity, religion, sex (including pregnancy or pregnancy-related medical conditions), sexual orientation, gender identity, marital or family status, veteran status, age, national origin, ancestry, physical disability (including HIV and AIDS), mental disability, medical condition, genetic information, membership or activity in a local human rights commission, status with regard to public assistance, or any other characteristic protected by law. We base all employment decisions, including recruitment, selection, training, compensation, benefits, discipline, promotions, transfers, lay-offs, return from lay-off, terminations and social/recreational programs, on valid job requirements.

If you need assistance accessing or reviewing the information on this website, or need help submitting an application for employment or requesting an accommodation, please contact us at recruiting@crowdstrike.com for further assistance.
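MLflow is one of the ML platform tools the posting names. As a minimal sketch of the standardized experiment-tracking pattern such a platform encourages, the following logs one parameter and one metric around a stand-in training step; the run name, parameter and metric names are hypothetical, and a default local ./mlruns tracking store is assumed.

```python
import mlflow  # pip install mlflow

def train(lr: float) -> float:
    """Stand-in for a real training loop; returns a validation score."""
    return 0.9 - abs(lr - 0.01)

with mlflow.start_run(run_name="baseline"):  # hypothetical run name
    lr = 0.01
    mlflow.log_param("learning_rate", lr)
    score = train(lr)
    mlflow.log_metric("val_score", score)
```

Standardizing this wrapper across teams is what makes runs comparable in a self-service experimentation platform: every model, regardless of author, reports its parameters and metrics the same way.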

Posted 2 months ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business. Senior Fullstack Engineer Work Model-Hybrid Mode Work Location- Bangalore/Mumbai/Pune/Gurgaon/Noida/Chennai/Hyderabad/Coimbatore Fractal is a leading AI & analytics organization. We have a strong Fullstack Team with great leaders accelerating the growth. Our people enjoy a collaborative work environment, exceptional training, and career development as well as unlimited growth opportunities. We have a Glassdoor rating of 4/5 and achieve customer NPS of 9/10. If you like working with a curious, supportive, high-performing team, Fractal is the place for you. Responsibilities As a Fullstack (React and Python) Engineer, you would be part of the team consisting of Scrum Master, Cloud Engineers, AI/ML Engineers, and UI/UX Engineers to build end-to-end Data to Decision Systems. You would report to a Lead Engineer and will be responsible for - Managing, developing & maintaining the backend and frontend for various Data to Decision projects for our Fortune 500 client Work closely with the data science & engineering team to integrate the algorithmic output from the backend REST APIs Work closely with business and product owners to create dynamic infographics with intuitive user controls Participate in UAT, and diagnose & troubleshoot, bugs and application integration issues Create and maintain documentation related to the developed processes and applications Qualification & Experience 5-10 years of demonstrable experience designing, building, and working as a Fullstack Engineer for enterprise web applications Ideally, this would include the following: Expert-level proficiency with JavaScript (ES6), HTML5 & CSS Expert-level proficiency with ReactJS or VueJS Expert-level proficiency with Node.js Expert-level proficiency with Python (3.4+), Django (2.1+) or Flask Or Java Familiarity with common databases (RDBMS such as MySQL & NoSQL such as MongoDB) and data warehousing concepts (OLAP, OLTP) Understanding of REST concepts and building/interacting with REST APIs Deep understanding of a few UI concepts: Cross-browser compatibility and implementing responsive web design Hands-on experience with test driven development, using testing libraries like Jest, PyTest and Nose Familiarity with common JS visualization libraries built using D3, Chart.js, Highcharts, etc. Deep understanding of core backend concepts: Develop and design RESTful services and APIs Develop functional databases, applications, and servers to support websites on the back end Performance optimization and multithreading concepts Experience with deploying and maintaining high traffic infrastructure (performance testing is a plus) In addition, the ideal candidate would have great problem-solving skills, and familiarity with code versioning tools such as Github Good to have Familiarity with Microsoft Azure Cloud Services (particularly Azure Web App, Storage and VM), or familiarity with AWS (EC2 containers) or GCP Services. Experience working with UX designers and bringing design to life Experience with Microservices, Messaging Brokers (e.g., RabbitMQ) Experience with reverse proxy engines such as Nginx, Apache HTTPD Familiarity with Github Actions or any other CI/CD tool (e.g., Jenkins) Education: B.E/B.Tech, BCA, MCA equivalent If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? 

Posted 2 months ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

About The Role: Our engineering team is growing, and we are looking to bring on board a Technical Lead who can help us transition to the next phase of the company. You will be pivotal in refining our system architecture, ensuring the various tech stacks play well with each other, and smoothing the DevOps process. You must have a well-versed understanding of software paradigms and the curiosity to carve out designs for varied ML, MLOps, and LLMOps problem statements. You will be expected to lead your team in the right direction through to the very end of each project's implementation. By joining our team, you will get exposure to a swath of modern technologies while building an enterprise-grade ML platform in a highly promising area.

Responsibilities:
Be the bridge between the engineering and product teams.
Understand the long-term product roadmap and architect a system design that will scale with our plans.
Take ownership of converting product insights into detailed engineering requirements.
Break down work among the team and orchestrate the development of components for each sprint.
Be well versed in solution design and documentation (HLD/LLD).
Develop "zero-defect software" with extreme efficiency by utilizing modern cutting-edge tools (ChatGPT, Copilot, etc.).
Adopt, and impart to others, the mindset of building software that is secure, instrumented, and resilient.
Author high-quality, high-performance, unit-tested code running in a distributed environment using containers.
Continually evaluate and improve DevOps processes for a cloud-native codebase.
Bring strong design skills in defining API data contracts, OOAD, microservices, data models, and concurrency concepts (see the sketch after this posting).
Be an ardent leader with an obsession for quality, refinement, innovation, and empowering leadership.

Qualifications
5-7 years of experience, with hands-on development of full-fledged systems/microservices using Python or JS programming.
3+ years of senior engineering responsibilities.
3+ years of people mentorship/leadership experience managing engineers, preferably with good exposure to leading multiple development teams.
3+ years of experience in object-oriented design and agile development methodologies.
Basic experience in developing and deploying cloud-native software on GCP/AWS/Azure.
Proven track record of building large-scale, product-grade (high-throughput, low-latency, scalable) systems.
A well-versed understanding of, and design skills for, SQL/NoSQL/OLAP databases.
Up to date with modern cutting-edge technologies that boost team efficiency and delivery. (Bonus: an understanding of generative AI techniques and frameworks such as RAG, LangChain, and LlamaIndex.)

Skills
Strong documentation skills. As a team, we rely heavily on elaborate documentation for everything we are working on.
Ability to take authoritative decisions and hold accountability.
Ability to motivate, lead, and empower others.
Strong independent contributor as well as a team player.
Working knowledge of ML and familiarity with MLOps concepts.

You will excel in this role if
You have a product mindset. You understand, care about, and can relate to our customers.
You take ownership, collaborate, and follow through to the very end.
You love solving difficult problems, stand your ground, and get what you want from engineers.
You resonate with our core values of innovation, curiosity, accountability, trust, fun, and social good.

What We Offer
An opportunity to work with cutting-edge AI technologies.
Collaborative and innovative work environment. Competitive salary and benefits.
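One of the design responsibilities above, defining API data contracts, can be sketched briefly. The sketch below uses Pydantic to pin down a request/response contract for a hypothetical model-inference microservice; the field names and the service itself are invented and are not part of the role.

# Hypothetical API data contract for a model-inference microservice,
# expressed with Pydantic. Field names and the service are invented.
from datetime import datetime
from pydantic import BaseModel, Field

class InferenceRequest(BaseModel):
    run_id: str = Field(..., description="Caller-supplied correlation id")
    features: dict  # feature name -> value, validated at the boundary

class InferenceResponse(BaseModel):
    run_id: str
    score: float
    served_at: datetime

# A FastAPI route could then enforce the contract at the service boundary:
# @app.post("/v1/infer", response_model=InferenceResponse)

Written down this way, the contract doubles as documentation (HLD/LLD artifacts) and gives each microservice team a stable interface to build against.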

Posted 2 months ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

This role is for one of our clients.
Industry: Technology, Information and Media
Seniority level: Associate level
Min Experience: 3 years
Location: Mumbai
JobType: full-time

About The Role
We are looking for an insightful and tech-savvy Data Visualization Analyst with 3-6 years of experience in transforming complex datasets into clear, actionable narratives. If you're passionate about crafting impactful dashboards, enjoy working with cutting-edge cloud data tools, and thrive in fast-paced environments, this role is for you. You'll work across functions—partnering with business, engineering, and analytics teams—to design intuitive, scalable, and aesthetically rich dashboards and reports that drive better decisions across the company.

What You'll Do
📊 Visualization Design & Development
Create and manage interactive dashboards and data visualizations using tools like Power BI, Tableau, or Looker.
Develop custom visuals and reports that are visually appealing, responsive, and tailored to stakeholder needs.
🛠️ Cloud-Based Data Access & Transformation
Extract, process, and model large-scale data from Azure Data Lake and Databricks, ensuring performance and accuracy.
Collaborate with data engineers to prepare, clean, and transform datasets for reporting and visualization.
🤝 Stakeholder Collaboration
Translate business questions into clear analytical visual narratives and performance dashboards.
Act as a visualization consultant to product, marketing, and operations teams, understanding their metrics and guiding visual design choices.
🔍 Data Quality & Governance
Perform data profiling, validation, and cleansing to ensure data integrity.
Maintain documentation and consistency across reports, visuals, and metric definitions.
🚀 Continuous Improvement & Innovation
Stay ahead of trends in dashboarding, self-serve analytics, and BI UX best practices.
Optimize existing dashboards to enhance performance, usability, and storytelling quality.

What You Bring
✔️ Core Skills & Experience
3-6 years of professional experience in data visualization, business intelligence, or analytics.
Strong hands-on knowledge of Azure Data Lake, Databricks, and cloud-native data platforms.
Advanced proficiency in one or more visualization tools: Power BI, Tableau, Looker, or similar.
Solid SQL experience for writing complex queries and transforming datasets.
Understanding of data modeling concepts, including star/snowflake schemas and OLAP cubes.
🧠 Nice-to-Have Skills
Familiarity with Azure Synapse, Data Factory, or Azure SQL Database.
Experience using Python or PySpark for data prep or analytics automation (see the sketch after this posting).
Exposure to data governance, role-based access control, or data lineage tools.

Soft Skills & Traits
Strong visual design sense and attention to detail.
Ability to explain complex technical topics in simple, business-friendly language.
Proactive mindset, keen to take ownership of dashboards from concept to delivery.
Comfortable working in agile teams and managing multiple projects simultaneously.

Preferred Qualifications
Bachelor's degree in Computer Science, Statistics, Data Analytics, or a related field.
Certifications in Azure, Databricks, Power BI, or Tableau are a plus.
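The star-schema and PySpark skills named above combine naturally: dashboard datasets are usually produced by joining a fact table to its dimensions and rolling up to the reporting grain. The sketch below shows that pattern with invented table and column names; on Databricks a SparkSession is already provided, so the builder line would be unnecessary there.

# Hypothetical PySpark sketch: star-schema join plus aggregation
# to produce a dashboard-ready dataset. All names are invented.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("dashboard-prep").getOrCreate()

fact_sales = spark.createDataFrame(
    [(1, "2024-01-05", 120.0), (2, "2024-01-06", 80.0), (1, "2024-01-06", 60.0)],
    ["product_id", "order_date", "revenue"],
)
dim_product = spark.createDataFrame(
    [(1, "Widget"), (2, "Gadget")], ["product_id", "product_name"]
)

# Enrich fact rows with dimension attributes, then roll up to the
# grain the dashboard needs (daily revenue per product).
daily_revenue = (
    fact_sales.join(dim_product, "product_id")
    .groupBy("order_date", "product_name")
    .agg(F.sum("revenue").alias("total_revenue"))
)
daily_revenue.show()

The resulting table is what a Power BI or Tableau dashboard would connect to, keeping heavy transformation in the data platform rather than in the visualization layer.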

Posted 2 months ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
We are seeking an experienced Data Architect with strong expertise in designing and implementing data models for OLTP and OLAP systems.

Skills / Qualifications
Bachelor's degree in a related field.
6+ years of experience in data modelling for OLTP and OLAP systems.
Strong knowledge of conceptual, logical, and physical data modelling techniques.
Hands-on experience with indexing, partitioning, and data sharding.
Understanding of database performance variables and optimization techniques.
Proficiency in data modelling tools, preferably DBSchema.
Experience with GCP databases such as AlloyDB, Cloud SQL, and BigQuery.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.
Functional knowledge of the mutual fund industry.

Job Responsibilities
Design and develop conceptual, logical, and physical data models for OLTP and OLAP systems.
Optimize database performance through indexing, partitioning, and data sharding strategies (see the sketch after this posting).
Ensure high performance, scalability, and reliability of database systems for near-real-time reporting and application interactions.
Collaborate with data engineers, analysts, and stakeholders to translate business requirements into effective data models.
Work with GCP databases such as AlloyDB, Cloud SQL, and BigQuery to design and implement data solutions.
Utilize data modelling tools such as DBSchema to create and maintain database schemas.
Stay updated with industry best practices and emerging trends in data modelling and database technologies.

Benefits
Competitive market rate (depending on experience).
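One partitioning strategy this role would work with on BigQuery is date partitioning with clustering, which limits how much data reporting queries scan. The sketch below uses the google-cloud-bigquery client; the project id, dataset, and columns are invented for illustration and are not from the posting.

# Hypothetical sketch: creating a date-partitioned, clustered
# BigQuery table with the google-cloud-bigquery client.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

schema = [
    bigquery.SchemaField("txn_id", "STRING"),
    bigquery.SchemaField("fund_code", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("txn_date", "DATE"),
]

table = bigquery.Table("my-project.analytics.fund_transactions", schema=schema)
# Partition by day on txn_date so near-real-time reporting queries
# scan only the partitions they touch; cluster by fund_code to
# co-locate rows for the most common lookup key.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="txn_date"
)
table.clustering_fields = ["fund_code"]
client.create_table(table)

The same intent in an OLTP system such as AlloyDB or Cloud SQL would instead be expressed with native PostgreSQL declarative partitioning and indexes.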

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies