6.0 - 9.0 years
18 - 25 Lacs
Bangalore Rural, Bengaluru
Work from Office
ETL Tester, ETL/Data Migration Testing, AWS-to-GCP data migration, PostgreSQL, AlloyDB, Presto, BigQuery, S3, GCS, Python for test automation, data warehousing and cloud-native tools, PostgreSQL to AlloyDB, Presto to BigQuery, S3 to Google Cloud Storage
Posted 1 week ago
6.0 - 10.0 years
8 - 12 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
Job Title: Data Modeller
Experience: 6+ Years
Location: Bangalore
Work Mode: Onsite

Job Role: We are seeking a skilled Data Modeller with expertise in designing data models for both OLTP and OLAP systems. The ideal candidate will have deep knowledge of data modelling principles and a strong understanding of database performance optimization, especially in near-real-time reporting environments. Prior experience with GCP databases and data modelling tools is essential.

Responsibilities:
- Design and implement data models (conceptual, logical, and physical) for complex business requirements
- Develop scalable OLTP and OLAP models to support enterprise data needs
- Optimize database performance through effective indexing, partitioning, and data sharding techniques
- Work closely with development and analytics teams to ensure alignment of models with application and reporting needs
- Use data modelling tools such as Erwin, DBSchema, or similar to create and maintain models
- Implement best practices for data quality, governance, and consistency across systems
- Leverage GCP database solutions such as AlloyDB, CloudSQL, and BigQuery
- Collaborate with business stakeholders, especially within the mutual fund domain (preferred), to understand data requirements

Requirements:
- 6+ years of hands-on experience in data modelling for OLTP and OLAP systems
- Strong command of data modelling fundamentals (conceptual, logical, physical)
- In-depth knowledge of indexing, partitioning, and data sharding strategies
- Experience with real-time and near-real-time reporting systems
- Proficiency in data modelling tools, preferably DBSchema or Erwin
- Familiarity with GCP databases such as AlloyDB, CloudSQL, and BigQuery
- Functional understanding of the mutual fund industry is a plus
- Must be willing to work from the Chennai office; on-site presence is mandatory

Technical Skills: Data Modelling (Conceptual, Logical, Physical), OLTP, OLAP, Indexing, Partitioning, Data Sharding, Database Performance Tuning, Real-Time/Near-Real-Time Reporting, DBSchema, Erwin, AlloyDB, CloudSQL, BigQuery.
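For illustration, a minimal sketch of the indexing-plus-partitioning pattern this posting asks about, in PostgreSQL/AlloyDB syntax; the table and column names are hypothetical:

    -- Range-partition a high-volume OLTP table by timestamp; in PostgreSQL the
    -- primary key of a partitioned table must include the partition key.
    CREATE TABLE txn (
        txn_id     bigint      NOT NULL,
        account_id bigint      NOT NULL,
        txn_ts     timestamptz NOT NULL,
        amount     numeric(18,2),
        PRIMARY KEY (txn_id, txn_ts)
    ) PARTITION BY RANGE (txn_ts);

    CREATE TABLE txn_2024_q1 PARTITION OF txn
        FOR VALUES FROM ('2024-01-01') TO ('2024-04-01');

    -- A composite index matching the dominant access path (per account,
    -- most recent first); created on the parent, it cascades to partitions.
    CREATE INDEX idx_txn_account_ts ON txn (account_id, txn_ts DESC);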
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Modeling Engineer specializing in near real-time reporting, you will be responsible for creating robust and optimized schemas to facilitate near real-time data flows for operational and analytical purposes within Google Cloud environments. Your primary focus will be on designing models that ensure agility, speed, and scalability to support high-throughput, low-latency data access needs.

Your key responsibilities will include designing data models that align with streaming pipelines, developing logical and physical models tailored for near real-time reporting, implementing strategies such as caching, indexing, and materialized views to enhance performance, and ensuring data integrity, consistency, and schema quality during rapid changes.

To excel in this role, you must possess experience in building data models for real-time or near real-time reporting systems, hands-on expertise with GCP platforms such as BigQuery, CloudSQL, and AlloyDB, and a solid understanding of Pub/Sub, streaming ingestion frameworks, and event-driven design. Additionally, proficiency in indexing strategies and adapting schemas in high-velocity environments is crucial.

Preferred skills for this position include exposure to monitoring, alerting, and observability tools, as well as functional familiarity with financial reporting workflows. Moreover, soft skills like proactive adaptability in fast-paced data environments, effective verbal and written communication, and a collaborative, solution-focused mindset will be highly valued.

By joining our team, you will have the opportunity to design the foundational schema for mission-critical real-time systems, contribute to the performance and reliability of enterprise data workflows, and be part of a dynamic GCP-focused engineering team.

Skills required for this role include streaming ingestion frameworks, BigQuery, reporting, modeling, AlloyDB, Pub/Sub, CloudSQL, Google Cloud Platform (GCP), data management, real-time reporting, indexing strategies, and event-driven design.
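As one illustration of the materialized-view strategy this posting mentions, a hedged sketch in BigQuery SQL; the dataset and table names are hypothetical:

    -- Pre-aggregate a streaming table so near-real-time dashboards read a
    -- small, incrementally maintained result instead of scanning raw events.
    CREATE MATERIALIZED VIEW analytics.mv_orders_per_minute AS
    SELECT
        TIMESTAMP_TRUNC(event_ts, MINUTE) AS minute_bucket,
        status,
        COUNT(*)    AS order_count,
        SUM(amount) AS total_amount
    FROM analytics.orders_stream
    GROUP BY minute_bucket, status;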
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You are a Database Performance & Data Modeling Specialist with a primary focus on optimizing schema structures, tuning SQL queries, and ensuring that data models are well-prepared for high-volume, real-time systems.

Your responsibilities include designing data models that balance performance, flexibility, and scalability; conducting performance benchmarking to identify bottlenecks and propose improvements; analyzing slow queries to recommend indexing, denormalization, or schema revisions; monitoring query plans, memory usage, and caching strategies for cloud databases; and collaborating with developers and analysts to optimize application-to-database workflows.

You must possess strong experience in database performance tuning, especially on GCP platforms like BigQuery, CloudSQL, and AlloyDB. Proficiency in schema refactoring, partitioning, clustering, and sharding techniques is essential. Familiarity with profiling tools, slow query logs, and GCP monitoring solutions is required, along with SQL optimization skills including query rewriting and execution plan analysis.

Preferred skills include a background in mutual fund or high-frequency financial data modeling, and hands-on experience with relational databases like PostgreSQL and MySQL, distributed caching, materialized views, and hybrid model structures.

Soft skills that are crucial for this role include being precision-driven with an analytical mindset, a clear communicator with attention to detail, and possessing strong problem-solving and troubleshooting abilities.

By joining this role, you will have the opportunity to shape high-performance data systems from the ground up, play a critical role in system scalability and responsiveness, and work with high-volume data in a cloud-native enterprise setting.
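A minimal sketch of the slow-query workflow this posting describes, assuming PostgreSQL/AlloyDB with the pg_stat_statements extension enabled; the query and table are hypothetical:

    -- Inspect the actual plan, timings, and buffer usage of a suspect query.
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT account_id, SUM(amount)
    FROM txn
    WHERE txn_ts >= now() - interval '1 day'
    GROUP BY account_id;

    -- Surface the statements with the worst average latency
    -- (column names per PostgreSQL 13+).
    SELECT query, calls, mean_exec_time, rows
    FROM pg_stat_statements
    ORDER BY mean_exec_time DESC
    LIMIT 10;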
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Cloud Data Modeler with 6 to 9 years of experience in GCP environments, you will play a crucial role in designing schema architecture, creating performance-efficient data models, and guiding teams on cloud-based data integration best practices. Your expertise will be focused on GCP data platforms such as BigQuery, CloudSQL, and AlloyDB.

Your responsibilities will include architecting and implementing scalable data models for cloud data warehouses and databases, optimizing OLTP/OLAP systems for reporting and analytics, supporting cloud data lake and warehouse architecture, and reviewing and optimizing existing schemas for cost and performance on GCP. You will also be responsible for defining documentation standards, ensuring model version tracking, and collaborating with DevOps and DataOps teams for deployment consistency.

Key Requirements:
- Deep knowledge of GCP data platforms including BigQuery, CloudSQL, and AlloyDB
- Expertise in data modeling, normalization, and dimensional modeling
- Understanding of distributed query engines, table partitioning, and clustering
- Familiarity with DBSchema or similar tools

Preferred Skills:
- Prior experience in BFSI or asset management industries
- Working experience with Data Catalogs, lineage, and governance tools

Soft Skills:
- Collaborative and consultative mindset
- Strong communication and requirements gathering skills
- Organized and methodical approach to data architecture challenges

By joining our team, you will have the opportunity to contribute to modern data architecture in a cloud-first enterprise, influence critical decisions around GCP-based data infrastructure, and be part of a future-ready data strategy implementation team.
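To make the partitioning-and-clustering review concrete, a small sketch in BigQuery SQL; the names are hypothetical, and requiring a partition filter is one common cost-control choice:

    -- Partition by date and cluster by the dominant filter column so queries
    -- prune partitions and scan less data (lower cost, faster reports).
    CREATE TABLE analytics.fact_nav (
        fund_id   INT64,
        nav_date  DATE,
        nav_value NUMERIC
    )
    PARTITION BY nav_date
    CLUSTER BY fund_id
    OPTIONS (require_partition_filter = TRUE);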
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will be responsible for creating and optimizing data models for both OLTP and OLAP systems, ensuring they are well-designed for performance and maintainability.

Your key responsibilities will include developing conceptual, logical, and physical models using DBSchema, aligning schema design with application requirements, and optimizing models in BigQuery, CloudSQL, and AlloyDB. Additionally, you will be involved in supporting schema documentation, reverse engineering, and visualization tasks.

Your must-have skills for this role include proficiency with the DBSchema modeling tool, strong experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB, and knowledge of OLTP and OLAP system structures and performance tuning. Expertise in SQL and schema evolution/versioning best practices is essential.

Preferred skills include experience integrating DBSchema with CI/CD pipelines and knowledge of real-time ingestion pipelines and federated schema design.

As a Data Modeler, you should possess soft skills such as being detail-oriented, organized, and communicative. You should also feel comfortable presenting schema designs to cross-functional teams.

By joining this role, you will have the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.
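As one illustration of the schema evolution/versioning practice this role calls for, a hedged sketch in PostgreSQL syntax; the schema_version table is a hypothetical team convention, not a built-in:

    -- Prefer additive, backward-compatible changes: existing readers keep
    -- working, and in PostgreSQL 11+ ADD COLUMN with a constant default is
    -- a fast, metadata-only operation.
    ALTER TABLE customer
        ADD COLUMN risk_tier text NOT NULL DEFAULT 'unrated';

    -- Record every applied change so environments can be compared and replayed.
    INSERT INTO schema_version (version, applied_at, description)
    VALUES ('2025.01.001', now(), 'add customer.risk_tier');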
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You are a versatile Data Model Developer with 6 to 9 years of experience, proficient in designing robust data models across cloud (GCP) and traditional RDBMS environments. Your role involves collaborating with cross-functional teams to develop schemas that cater to both operational systems and analytical use cases.

Your key responsibilities include designing and implementing scalable data models for cloud (GCP) and traditional RDBMS, supporting hybrid data architectures that integrate real-time and batch workflows, collaborating with engineering teams for seamless schema implementation, documenting conceptual, logical, and physical models, assisting in ETL and data pipeline alignment with schema definitions, and monitoring and refining performance through partitioning and indexing strategies (see the sketch below).

You must have experience with GCP data services like BigQuery, CloudSQL, and AlloyDB, proficiency in relational databases such as PostgreSQL, MySQL, or Oracle, a solid grounding in OLTP/OLAP modeling principles, familiarity with schema design tools like DBSchema and ER/Studio, and the SQL expertise needed for query performance optimization.

Preferred skills include experience working in hybrid cloud/on-prem data architectures, functional knowledge of BFSI or asset management domains, and knowledge of metadata management and schema versioning.

Soft skills required for this role include adaptability to cloud and legacy tech stacks, clear communication with engineers and analysts, and strong documentation and collaboration skills.

Joining this role will allow you to contribute to dual-mode data architecture (cloud + on-prem), solve real-world data design challenges in regulated industries, and influence platform migration and modernization.
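One possible refinement of that partitioning-and-indexing work: a covering, partial index in PostgreSQL syntax (also valid on AlloyDB); the table and columns are hypothetical:

    -- Index only the hot subset and carry the reported columns in the index
    -- leaf pages, so the reporting query can become an index-only scan.
    CREATE INDEX idx_orders_open_recent
        ON orders (customer_id, order_ts DESC)
        INCLUDE (status, total_amount)
        WHERE status = 'OPEN';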
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Architect specializing in OLTP & OLAP systems, you will play a crucial role in designing, optimizing, and governing data models for both OLTP and OLAP environments. Your responsibilities will include architecting end-to-end data models across different layers, defining conceptual, logical, and physical data models, and collaborating closely with stakeholders to capture functional and performance requirements.

You will need to optimize database structures for real-time and analytical workloads, enforce data governance, security, and compliance best practices, and enable schema versioning, lineage tracking, and change control. Additionally, you will review query plans and indexing strategies to enhance performance.

To excel in this role, you must possess a deep understanding of OLTP and OLAP systems architecture, along with proven experience in GCP databases such as BigQuery, CloudSQL, and AlloyDB. Your expertise in database tuning, indexing, sharding, and normalization/denormalization will be critical, as will proficiency in data modeling tools like DBSchema, ERWin, or equivalent. Familiarity with schema evolution, partitioning, and metadata management is also required.

Experience in the BFSI or mutual fund domain, knowledge of near real-time reporting and streaming analytics architectures, and familiarity with CI/CD for database model deployments are preferred skills that will set you apart. Strong communication, stakeholder management, strategic thinking, and the ability to mentor data modelers and engineers are essential soft skills for success in this position.

By joining our team, you will have the opportunity to own the core data architecture for a cloud-first enterprise, bridge business goals with robust data design, and work with modern data platforms and tools. If you are looking to make a significant impact in the field of data architecture, this role is perfect for you.
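To ground the normalization/denormalization trade-off this posting raises, a minimal denormalization sketch using a PostgreSQL materialized view; the fund and holding tables are hypothetical:

    -- Denormalize a frequent join into a materialized view for analytical reads,
    -- while the OLTP side stays normalized.
    CREATE MATERIALIZED VIEW mv_fund_summary AS
    SELECT f.fund_id,
           f.fund_name,
           COUNT(h.holding_id) AS holdings,
           SUM(h.market_value) AS aum
    FROM fund f
    JOIN holding h ON h.fund_id = f.fund_id
    GROUP BY f.fund_id, f.fund_name;

    -- A unique index permits non-blocking refreshes.
    CREATE UNIQUE INDEX ON mv_fund_summary (fund_id);
    REFRESH MATERIALIZED VIEW CONCURRENTLY mv_fund_summary;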
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are looking for a Data Modelling Consultant with 6 to 9 years of experience to work in our Chennai office. As a Data Modelling Consultant, your role will involve providing end-to-end modeling support for OLTP and OLAP systems hosted on Google Cloud.

Your responsibilities will include designing and validating conceptual, logical, and physical models for cloud databases, translating requirements into efficient schema designs, and supporting data model reviews, tuning, and implementation. You will also guide teams on best practices for schema evolution, indexing, and governance, enabling the use of models in real-time applications and analytics platforms.

To succeed in this role, you must have strong experience in modeling across OLTP and OLAP systems, hands-on experience with GCP tools like BigQuery, CloudSQL, and AlloyDB, and the ability to understand business rules and translate them into scalable structures. Additionally, familiarity with partitioning, sharding, materialized views, and query optimization is essential.

Preferred skills for this role include experience with BFSI or financial domain data schemas and familiarity with modeling methodologies and standards such as 3NF and star schema. Soft skills like excellent stakeholder communication, collaboration, strategic thinking, and attention to scalability are also important.

Joining this role will allow you to deliver advisory value across critical data initiatives, influence the modeling direction for a data-driven organization, and be at the forefront of GCP-based enterprise data transformation.
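A tiny illustration of the 3NF-versus-star-schema distinction the posting mentions: a star-schema fact and dimension pair in generic SQL, with hypothetical names:

    -- Dimension: descriptive attributes, one row per fund.
    CREATE TABLE dim_fund (
        fund_key  bigint PRIMARY KEY,
        fund_code text NOT NULL,
        fund_name text,
        category  text
    );

    -- Fact: narrow, additive measures keyed by dimensions; this shape favors
    -- OLAP scans, whereas the OLTP side would stay normalized (3NF).
    CREATE TABLE fact_daily_nav (
        fund_key  bigint NOT NULL REFERENCES dim_fund (fund_key),
        date_key  int    NOT NULL,
        nav_value numeric(18,4),
        aum       numeric(18,2)
    );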
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Cloud Data Architect specializing in BigQuery and CloudSQL at our Chennai office, you will play a crucial role in leading the design and implementation of scalable, secure, and high-performing data architectures using Google Cloud technologies. Your expertise will be essential in shaping architectural direction and ensuring that data solutions meet enterprise-grade standards.

Your responsibilities will include designing data architectures that align with performance, cost-efficiency, and scalability needs; implementing data models, security controls, and access policies across GCP platforms; leading cloud database selection, schema design, and tuning for analytical and transactional workloads; collaborating with DevOps and DataOps teams to deploy and manage data environments; ensuring best practices for data governance, cataloging, and versioning; and enabling real-time and batch integrations using GCP-native tools.

To excel in this role, you must possess deep knowledge of BigQuery, CloudSQL, and the GCP data ecosystem, along with strong experience in schema design, partitioning, clustering, and materialized views. Hands-on experience implementing data encryption, IAM policies, and VPC configurations is crucial, as is an understanding of hybrid and multi-cloud data architecture strategies and data lifecycle management. Proficiency in GCP cost optimization is also required.

Preferred skills for this role include experience with AlloyDB, Firebase, or Spanner, familiarity with LookML, dbt, or DAG-based orchestration tools, and exposure to the BFSI domain or financial services architecture. In addition to technical skills, soft skills such as visionary thinking with practical implementation skills, strong communication, and cross-functional leadership are highly valued. Previous experience guiding data strategy in enterprise settings will be advantageous.

Joining our team will give you the opportunity to own data architecture initiatives in a cloud-native ecosystem, drive innovation through scalable and secure GCP designs, and collaborate with forward-thinking data and engineering teams.

Skills required for this role include IAM policies, Spanner, schema design, data architecture, the GCP data ecosystem, dbt, GCP cost optimization, AlloyDB, data encryption, data lifecycle management, BigQuery, LookML, VPC configurations, partitioning, clustering, materialized views, DAG-based orchestration tools, Firebase, and CloudSQL.
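As one example of the access-policy work described here, a hedged sketch of BigQuery row-level security; the dataset, table, group, and region values are hypothetical:

    -- Restrict which rows a group can read; BigQuery enforces the filter
    -- at query time for the named grantees.
    CREATE ROW ACCESS POLICY apac_only
    ON analytics.client_positions
    GRANT TO ('group:apac-analysts@example.com')
    FILTER USING (region = 'APAC');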
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Modeller specializing in GCP and cloud databases, you will play a crucial role in designing and optimizing data models for both OLTP and OLAP systems. Your expertise in cloud-based databases, data architecture, and modeling will be essential in collaborating with engineering and analytics teams to ensure efficient operational systems and real-time reporting pipelines.

You will be responsible for designing conceptual, logical, and physical data models tailored for OLTP and OLAP systems. Your focus will be on developing and refining models that support performance-optimized cloud data pipelines, implementing models in BigQuery, CloudSQL, and AlloyDB, and designing schemas with indexing, partitioning, and data sharding strategies.

Translating business requirements into scalable data architecture and schemas will be a key aspect of your role, along with optimizing for near real-time ingestion, transformation, and query performance. You will utilize tools like DBSchema for collaborative modeling and documentation while creating and maintaining metadata and documentation around models.

In terms of required skills, hands-on experience with GCP databases (BigQuery, CloudSQL, AlloyDB), a strong understanding of OLTP and OLAP systems, and proficiency in database performance tuning are essential. Additionally, familiarity with modeling tools such as DBSchema or ERWin, along with proficiency in SQL, schema definition, and normalization/denormalization techniques, will be beneficial.

Preferred skills include functional knowledge of the mutual fund or BFSI domain, experience integrating with cloud-native ETL and data orchestration pipelines, and familiarity with schema version control and CI/CD in a data context.

In addition to technical skills, soft skills such as strong analytical and communication abilities, attention to detail, and a collaborative approach across engineering, product, and analytics teams are highly valued.

Joining this role will provide you with the opportunity to work on enterprise-scale cloud data architectures, drive performance-oriented data modeling for advanced analytics, and collaborate with high-performing cloud-native data teams.
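To illustrate the data-sharding strategy mentioned above, a minimal hash-partitioning sketch in PostgreSQL/AlloyDB syntax; the table and columns are hypothetical:

    -- Hash partitioning spreads writes evenly across partitions, a common
    -- single-node stand-in for sharding by a high-cardinality key.
    CREATE TABLE events (
        event_id  bigint NOT NULL,
        device_id bigint NOT NULL,
        payload   jsonb,
        PRIMARY KEY (event_id, device_id)
    ) PARTITION BY HASH (device_id);

    CREATE TABLE events_p0 PARTITION OF events FOR VALUES WITH (MODULUS 4, REMAINDER 0);
    CREATE TABLE events_p1 PARTITION OF events FOR VALUES WITH (MODULUS 4, REMAINDER 1);
    CREATE TABLE events_p2 PARTITION OF events FOR VALUES WITH (MODULUS 4, REMAINDER 2);
    CREATE TABLE events_p3 PARTITION OF events FOR VALUES WITH (MODULUS 4, REMAINDER 3);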
Posted 2 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Gurugram
Work from Office
About the Role:
Grade Level (for internal use): 09

The Team: Automotive Mastermind was founded on the idea that there are patterns in people's behavior that, with the right logic, can be used to predict future outcomes. Our software helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Our culture is creative and entrepreneurial, where everyone contributes to company goals in a very real way. We are a hardworking group, but we have a lot of fun with what we do and are looking for new people with a similar mindset to join the organization.

The Impact: As a Quality Engineer you will collaborate with members of both the Product and Development Teams to help them make informed decisions on releases of one of the best tools there is for car dealerships in the United States.

What's in it for you:
- The possibility to work on a project in a very interesting domain, the automotive industry in the United States, and to influence the quality of one of the best tools there is for car dealerships.
- Influence over the processes and tools used for Quality Engineering. Our team has a high degree of autonomy in the Automotive Mastermind organization to decide what tools and processes we will use.

Responsibilities:
- Own and be responsible for testing and delivery of the product or core modules.
- Assess the quality, usability, and functionality of each release.
- Review software requirements and prepare test scenarios for complex business rules.
- Interact with stakeholders to understand detailed requirements and expectations.
- Build technical knowledge and aim to be a quality SME in core functional components.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Design and implement a test automation strategy supporting multiple product development teams.
- Lead efforts for related automation projects, design reviews, and code reviews.
- Produce regular reports on the status and quality of software releases, and be prepared to speak to findings in an informative way to all levels of audience.

What We're Looking For:
- Participate in and improve the whole lifecycle of services, from inception and design through deployment, operation, and refinement.
- Participate in the release planning process to review functional specifications and create release plans.
- Collaborate with software engineers to design verification test plans.
- Design regression test suites and review them with engineering, applications, and the field organization.
- Assess the quality, usability, and functionality of each release.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Lead and train a dynamically changing team of colleagues who participate in testing processes.
- Exhibit expertise in handling large-scale programs/projects that involve multiple stakeholders (Product, Dev, DevOps).
- Maintain a leading-edge understanding of QA best practices as they relate to interactive technologies.
- Design and implement a test automation strategy for multiple product development teams at the onset of the project.
- Work closely with leadership and IT to provide input into the design and implementation of the automation framework.
- Work with Architecture, Engineering, Quality Engineering, IT, and Product Operations leaders to create and implement processes that accelerate the delivery of new features and products with high quality and at scale.
- Develop and contribute to a culture of high performance, transparency, and continuous improvement as it relates to infrastructure services and streamlining of the development pipeline.
- Participate in a diverse team of talented engineers globally, providing guidance, support, and clear priorities.

Who you are:
- Total experience: 2 to 6 years.
- Hands-on experience with at least two leading testing tools/frameworks such as Playwright, Robot Framework, K6, or JMeter.
- Hands-on experience working with Python.
- Experience with SQL/NoSQL databases.
- Experience working on cloud-native applications.
- Hands-on experience with Google Cloud services such as Kubernetes, Composer, Dataplex, Pub/Sub, BigQuery, AlloyDB, CloudSQL, Looker Studio, etc.
- Strong analytical skills and the ability to solve complex technical problems.
- API testing: a solid understanding of RESTful design and best practices is a must, along with hands-on experience testing APIs and using test tools.
- Experience with load/stress/performance testing and tools.
- Experience with Azure DevOps (or other, similar issue/bug tracking systems) is required.
- Ability to think abstractly rather than conform to the norm; conventional approaches do not find bugs quickly.
- Experience working in an Agile software development organization.
- Experience supporting development and product teams.
- Excellent verbal, written, and interpersonal communication skills; ability to interact with all levels of an organization.
- Ability to work in an advisory capacity to identify key technical and business problems, and to develop and evaluate solutions.

Grade: 08 / 09
Job Location: Gurugram
Hybrid Mode: twice a week work from office.
Shift Time: 12 pm to 9 pm IST.
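Since the role spans SQL databases and BigQuery, here is a hedged sketch of the kind of data-quality assertions a quality engineer might automate; the table names are hypothetical and the syntax is generic SQL:

    -- Duplicate-key check: a passing run returns zero rows.
    SELECT dealer_id, COUNT(*) AS dupes
    FROM target_db.dealers
    GROUP BY dealer_id
    HAVING COUNT(*) > 1;

    -- Row-count reconciliation between a source table and its migrated copy.
    SELECT
        (SELECT COUNT(*) FROM source_db.dealers) AS source_rows,
        (SELECT COUNT(*) FROM target_db.dealers) AS target_rows;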
Posted 1 month ago
6.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Job Description
At Technoidentity, we're a Data + AI product engineering company trusted for delivering scalable and modern enterprise solutions. Join us as a Database Developer and play a critical role in migrating enterprise data systems from Oracle to PostgreSQL on AlloyDB, using cloud-native and performance-optimized practices. If you're hands-on with PL/SQL and PL/pgSQL, experienced in performance tuning and validation, and confident in transforming complex data logic, this role is designed for you.

What's in it for You?
- Migration Engineering at Scale – Execute real-world migrations of legacy Oracle databases to AlloyDB/PostgreSQL using structured and proven methodologies.
- Database Rewriting & Optimization – Rewrite stored procedures and data logic in PL/pgSQL, reconfigure indexes, and tune performance for high-throughput systems.
- Modern Stack Exposure – Work with cloud-native services and best practices as part of enterprise modernization journeys.
- Data Validation Accountability – Lead source-to-target comparison of Oracle vs. AlloyDB, ensuring no data loss or logic drift post-migration.
- Proactive Culture – Take ownership of data logic, performance metrics, and delivery quality in a team that values accountability.

Requirements: What Will You Be Doing?

Database Migration & Transformation
- Migrate Oracle databases to PostgreSQL (AlloyDB) through manual and automated processes.
- Rewrite stored procedures, packages, triggers, and SQL logic from PL/SQL to PL/pgSQL.
- Recreate schemas, constraints, indexes, views, and other DB objects in the target PostgreSQL environment.

Performance Optimization
- Design and implement table partitioning, advanced indexing, and optimized query plans.
- Use tools like EXPLAIN ANALYZE, pg_stat_statements, and cloud-native monitoring to track and improve query performance.
- Collaborate with infrastructure and DevOps teams to manage DB provisioning and deployment workflows.

Data Validation & Quality Assurance
- Perform data comparison between Oracle and AlloyDB to ensure data integrity post-migration.
- Develop scripts to validate row-level accuracy, transformation rules, and reconciliation reports.
- Handle large datasets efficiently, including bulk data processing and archival validation.

Collaboration & Documentation
- Partner with Business Analysts, Tech Leads, and QA to translate business logic embedded in Oracle code into scalable PostgreSQL logic.
- Maintain documentation on migration steps, schema differences, validation strategies, and rollback procedures.

What Makes You the Perfect Fit?
- 5+ years of experience in database development and migration across Oracle and PostgreSQL ecosystems.
- Strong expertise in PL/SQL and PL/pgSQL, including stored procedures, error handling, and performance tuning.
- Deep understanding of data validation practices, SQL reconciliation, and performance benchmarking.
- Hands-on with partitioning, indexing strategies, schema conversion, and legacy modernization workflows.
- Familiarity with version control, Agile practices, and collaborative development environments.

Nice to Have
- Experience working in cloud environments, particularly with PostgreSQL-based systems such as AlloyDB or Aurora.
- Exposure to Python or shell scripting for automation, validation, or DevOps handoffs.
- Knowledge of BTEQ-to-RSQL conversion or legacy script transformation for data workflows.

Personal Attributes
- Strong ownership of assigned modules with a proactive approach to problem-solving.
- Detail-oriented with a passion for data integrity and completeness.
- Accountable, collaborative, and delivery-focused in dynamic cross-functional environments.
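To illustrate the PL/SQL-to-PL/pgSQL rewrites this role centers on, a minimal, hypothetical example; the function and fee rule are invented for illustration:

    -- Oracle PL/SQL original:
    --   CREATE OR REPLACE FUNCTION get_fee(p_amount NUMBER) RETURN NUMBER IS
    --   BEGIN
    --     RETURN NVL(p_amount, 0) * 0.0125;
    --   END;

    -- PostgreSQL/AlloyDB PL/pgSQL rewrite: NUMBER becomes numeric, and the
    -- Oracle-only NVL maps to the standard COALESCE.
    CREATE OR REPLACE FUNCTION get_fee(p_amount numeric)
    RETURNS numeric
    LANGUAGE plpgsql
    AS $$
    BEGIN
        RETURN COALESCE(p_amount, 0) * 0.0125;
    END;
    $$;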
Posted 2 months ago
6.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
At Technoidentity, we're a Data + AI product engineering company trusted for delivering scalable and modern enterprise solutions. Join us as a Technical Analyst and play a pivotal role in our data migration programs by bridging Oracle SQL logic and modern application architectures. If you have a strong grasp of full-stack systems, programming logic, and data flow, but prefer analysis over hands-on coding, this role is for you.

What's in it for You?
- Legacy-to-Modern Transformation – Work at the heart of large-scale database migration programs, decoding legacy Oracle PL/SQL into actionable development specifications.
- Full Stack Awareness – Engage with systems spanning UI, middleware, and databases to support modern cloud-native architectures.
- Cross-Functional Collaboration – Act as a technical bridge between SQL-heavy legacy logic and Agile development teams working on modern stacks.
- Modern Data Ecosystem – Contribute to projects involving PostgreSQL/AlloyDB, microservices, and cloud-native technologies.

Requirements: What Will You Be Doing?

SQL Logic Analysis & Translation
- Interpret and analyze Oracle SQL/PL-SQL logic, including stored procedures, packages, and functions.
- Break down complex SQL rules into structured, developer-consumable requirements, user stories, or logic specifications.
- Map legacy data flows and transformation rules into technical documentation aligned with the new architecture.

Full Stack System Understanding
- Understand application structure across the UI layer, middle tier, and database layer to identify integration points and design constraints.
- Collaborate with architects and developers to align business logic with the target tech stack (e.g., Java, Python, C# on PostgreSQL/AlloyDB).

Documentation & Requirements Engineering
- Write clear and concise technical requirement documents, field mappings, and logic flow diagrams.
- Actively support Agile ceremonies and backlog grooming by providing technical context and answering requirement-level questions.

Collaboration & Project Enablement
- Partner with product owners, developers, QA engineers, and DevOps teams to ensure clarity and traceability across the development lifecycle.
- Facilitate smooth handoff of functional logic and edge-case details uncovered from legacy systems.

What Makes You the Perfect Fit?
- 6+ years of experience in technical analysis, solution engineering, or system modernization roles.
- Strong understanding of Oracle SQL and PL/SQL, including the ability to interpret complex procedures and packages.
- Broad knowledge of software architecture, including UI, APIs/middleware, and relational databases.
- Ability to read code in Java, Python, or C# and understand the logical flow (no coding required).
- Experience supporting data migration, system transformation, or legacy modernization initiatives.
- Proficiency in writing technical specs, logic maps, and data field mappings for cross-functional teams.
- Comfortable working in Agile environments using tools like ADO, wikis, and Lucidchart/Visio.

Nice to Have
- Exposure to cloud-native platforms (GCP/AWS/Azure) and modern databases (AlloyDB/PostgreSQL).
- Familiarity with microservices, REST APIs, and data serialization formats (JSON, XML).
- Knowledge of data quality rules and validation frameworks in ETL/data migration.
- Awareness of CI/CD pipelines, DevOps, and version control with Git.

Personal Attributes
- Strong sense of ownership, quality, and accountability in delivering logical clarity.
- Exceptional problem-solving skills with the ability to "reverse engineer" business rules from SQL.
- Clear communicator and collaborative partner across technical and non-technical teams.
Posted 2 months ago
10.0 - 18.0 years
35 - 65 Lacs
Pune, Chennai, Bengaluru
Hybrid
Interested candidates must revert with an updated CV to sawanti.mandal@citiustech.com

Job Description
Job Title: MERN Technical Architect

Who we are: CitiusTech - Shaping Healthcare Possibilities. As one of the world's fastest-growing healthcare technology services companies, CitiusTech plays a significant role in shaping the way healthcare is delivered to patients. We are a team of 8500+ world-class technology and analytics professionals serving 140+ healthcare organizations worldwide, driven by a strong sense of purpose to accelerate innovation in healthcare and make a meaningful impact on human lives. We focus on building highly motivated engineering teams and thought leaders with an entrepreneurial mindset, centred on our core values of passion, respect, openness, unity and depth of knowledge. Our success lies in our ability to create a transparent, collaborative and non-hierarchical work environment that values opinion and empowers you to meet your personal and professional aspirations. This is what has helped make us India's #1 healthcare technology services company and the first Unicorn in this space, way back in 2019. With the exponential growth in healthcare technology adoption, CitiusTech is uniquely positioned to drive disruptive change across the healthcare industry. At CitiusTech, you become a part of this incredible growth story.

What is in it for you? As a Technical Architect (MERN), you will collaborate closely with key stakeholders to design and develop a core product for one of our valued clients. Your role will require deep technical expertise and a strategic mindset to ensure scalability, performance, and exceptional user experience.

Responsibilities:
- Lead the architectural design and development of a scalable, high-performance product with a strong focus on user experience.
- Spearhead the modernization of the Radiology Worklist Program's user interface by developing a Micro Frontend (MFE) application using React.
- Seamlessly integrate the new React-based MFE with existing Angular components, leveraging Material-UI (MUI) for a consistent design system.
- Enhance and refactor backend services built in Node.js to support new features, including customizable user-defined worklists.
- Conduct a comparative analysis of AlloyDB and PostgreSQL to determine the optimal database solution for advanced analytics.
- Redesign and optimize the database schema to support new functionalities and improve overall system efficiency.
- Design and implement an intermediate caching layer using Elasticsearch to boost application performance and scalability.

Experience: 10+ years
Location: Any
Educational Qualifications: Engineering degree (BE/ME/BTech/MTech/B.Sc./M.Sc.). Technical certification in multiple technologies is desirable.
Skills:
Mandatory technical skills: NodeJS, ReactJS, PostgreSQL
Good to have: AlloyDB

Our commitment: To combine the best of IT services, consulting, products, accelerators, and frameworks with a client-first mindset and next-gen tech understanding. Together, we're humanizing healthcare to make a positive impact on human lives.

What drives us: At CitiusTech, we believe in making a tangible difference in healthcare. We constantly explore new ways to transform the industry, from AI-driven solutions to advanced data analytics and cloud computing. Our collaborative culture, combined with a relentless drive for excellence, positions us as innovators reshaping the healthcare landscape, one solution at a time.

Life@CitiusTech: We focus on building highly motivated engineering teams and thought leaders with an entrepreneurial mindset, centered on our core values of Passion, Respect, Openness, Unity, and Depth (PROUD) of knowledge. Our success lies in creating a fun, transparent, non-hierarchical, diverse work culture that focuses on continuous learning and work-life balance. We are rated a Great Place to Work by our employees, according to the Great Place to Work survey. We offer you comprehensive benefits to ensure you have a long and rewarding career with us.

Our EVP: "Be You Be Awesome" is our EVP. It reflects our continuing efforts to make CitiusTech a great workplace where our employees can thrive, personally and professionally. It encompasses the unique benefits and opportunities we offer to support your growth, well-being, and success throughout your journey with us and beyond. Together with our clients, we are solving some of the greatest healthcare challenges and positively impacting human lives. Welcome to the world of Faster Growth, Higher Learning, and Stronger Impact.

Here is an opportunity for you to make a difference and collaborate with global leaders to shape the future of healthcare and positively impact human lives. To learn more about CitiusTech, visit https://www.citiustech.com/careers and follow us.

Happy applying!

Best Regards,
Sawanti Mandal
AM - Recruitments @ CitiusTech Healthcare Technology
7208951606
Posted 2 months ago