3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for an experienced Data Engineer specializing in Apache Cassandra to oversee the processing of large-scale data. The ideal candidate will have a proven track record in managing crores of records, with a specific focus on high-performance CRUD operations, efficient data retrieval through effective sharding strategies, and the integration of Cassandra with dashboards and reporting tools.

Key Responsibilities:
- Design, implement, and manage scalable data models and Cassandra clusters.
- Handle and process massive datasets (crores of records), optimizing for speed and reliability.
- Implement data sharding and partitioning strategies to facilitate distributed data storage.
- Ensure fast and reliable CRUD operations on high-volume datasets.
- Integrate Cassandra with visualization/dashboard tools to enable real-time data access and reporting.
- Monitor, troubleshoot, and fine-tune cluster performance and availability.

Required Skills:
- Strong hands-on experience with Apache Cassandra in production environments.
- Demonstrated ability to manage and query crores of records with high performance.
- In-depth knowledge of Cassandra architecture, data modeling, and tunable consistency.
- Proficiency in scripting languages such as Python/Bash, and experience in performance tuning.
- Experience integrating Cassandra with dashboarding or reporting tools.
- Understanding of distributed systems and NoSQL best practices.
- Familiarity with tools like Spark and Kafka.

Please note that the above is a summary of the job description provided. For more information, please refer to the original source.
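The sharding and partitioning duties described in this role rest on one core idea: a stable hash of a row's key decides which partition stores it, so records spread evenly across nodes. A minimal illustrative sketch (the partition count and key names are hypothetical, not from the listing, and real Cassandra uses its own Murmur3 token ring):

```python
# Illustrative sketch of hash-based sharding: map each row key to one of a
# fixed number of partitions via a stable hash, so a large dataset spreads
# roughly evenly. Partition count and key format are hypothetical.
import hashlib

NUM_PARTITIONS = 8

def partition_for(key: str) -> int:
    """Map a row key to a partition using a deterministic hash."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    token = int.from_bytes(digest[:8], "big")
    return token % NUM_PARTITIONS

# Distribute a batch of synthetic row keys and inspect the spread.
keys = [f"user-{i}" for i in range(10_000)]
counts = [0] * NUM_PARTITIONS
for k in keys:
    counts[partition_for(k)] += 1

print(counts)  # roughly even buckets across the 8 partitions
```

Because the hash depends only on the key, any node can compute the owning partition without coordination, which is what makes CRUD operations on high-volume datasets fast to route.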
Posted 5 days ago
6.0 - 10.0 years
8 - 12 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

Job Title: Data Modeller
Experience: 6+ Years
Location: Bangalore
Work Mode: Onsite

Job Role:
We are seeking a skilled Data Modeller with expertise in designing data models for both OLTP and OLAP systems. The ideal candidate will have deep knowledge of data modelling principles and a strong understanding of database performance optimization, especially in near-real-time reporting environments. Prior experience with GCP databases and data modelling tools is essential.

Responsibilities:
- Design and implement data models (Conceptual, Logical, and Physical) for complex business requirements
- Develop scalable OLTP and OLAP models to support enterprise data needs
- Optimize database performance through effective indexing, partitioning, and data sharding techniques
- Work closely with development and analytics teams to ensure alignment of models with application and reporting needs
- Use data modelling tools like Erwin, DBSchema, or similar to create and maintain models
- Implement best practices for data quality, governance, and consistency across systems
- Leverage GCP database solutions such as AlloyDB, CloudSQL, and BigQuery
- Collaborate with business stakeholders, especially within the mutual fund domain (preferred), to understand data requirements

Requirements:
- 6+ years of hands-on experience in data modelling for OLTP and OLAP systems
- Strong command of data modelling fundamentals (Conceptual, Logical, Physical)
- In-depth knowledge of indexing, partitioning, and data sharding strategies
- Experience with real-time and near-real-time reporting systems
- Proficiency in data modelling tools, preferably DBSchema or Erwin
- Familiarity with GCP databases like AlloyDB, CloudSQL, and BigQuery
- Functional understanding of the mutual fund industry is a plus
- Must be willing to work from the Chennai office; on-site presence is mandatory

Technical Skills: Data Modelling (Conceptual, Logical, Physical), OLTP, OLAP, Indexing, Partitioning, Data Sharding, Database Performance Tuning, Real-Time/Near-Real-Time Reporting, DBSchema, Erwin, AlloyDB, CloudSQL, BigQuery.
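The indexing skill this role emphasizes can be seen in miniature with SQLite: adding an index changes the planner from a full table scan to an index search. A small sketch (table and column names are hypothetical; the listing's GCP databases apply the same principle at scale):

```python
# Illustrative sketch (hypothetical table/column names): how an index changes
# a query plan. Without an index, a filter on `fund` scans every row; with
# one, the planner searches the index instead.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, fund TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades (fund, amount) VALUES (?, ?)",
    [(f"fund-{i % 50}", float(i)) for i in range(5_000)],
)

# Plan before indexing: a full scan of the table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE fund = 'fund-7'"
).fetchone()

conn.execute("CREATE INDEX idx_trades_fund ON trades (fund)")

# Plan after indexing: an index search on idx_trades_fund.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE fund = 'fund-7'"
).fetchone()

print(plan_before[-1])  # e.g. "SCAN trades"
print(plan_after[-1])   # e.g. "SEARCH trades USING INDEX idx_trades_fund (fund=?)"
```

The same trade-off drives the partitioning and sharding choices the role lists: each technique narrows how much data a query must touch.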
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
Wipro Limited is a leading technology services and consulting company dedicated to developing innovative solutions that cater to the most complex digital transformation needs of clients. Our comprehensive range of consulting, design, engineering, and operational capabilities enables us to assist clients in achieving their most ambitious goals and establishing sustainable, future-ready businesses. With a global presence of over 230,000 employees and business partners spanning 65 countries, we remain committed to supporting our customers, colleagues, and communities in navigating an ever-evolving world.

We are currently seeking an individual with hands-on experience in data modeling for both OLTP and OLAP systems. The ideal candidate should possess a deep understanding of Conceptual, Logical, and Physical data modeling, coupled with a robust grasp of indexing, partitioning, and data sharding, supported by practical experience. Experience in identifying and mitigating factors impacting database performance for near-real-time reporting and application interaction is essential. Proficiency in at least one data modeling tool, preferably DBSchema, is required. Additionally, functional knowledge of the mutual fund industry would be beneficial. Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery is preferred.

The role demands willingness to work from our Chennai office, with mandatory on-site presence at the customer site five days per week. Cloud-PaaS-GCP-Google Cloud Platform is a mandatory skill set for this position. The successful candidate should have 5-8 years of relevant experience and should be prepared to contribute to the reimagining of Wipro as a modern digital transformation partner.

We are looking for individuals who are inspired by reinvention - of themselves, their careers, and their skills. At Wipro, we encourage continuous evolution, reflecting our commitment to adapt to the changing world around us. Join us in a business driven by purpose, where you have the freedom to shape your own reinvention. Realize your ambitions at Wipro. We welcome applications from individuals with disabilities. For more information, please visit www.wipro.com.
Posted 2 weeks ago
9.0 - 13.0 years
32 - 40 Lacs
Ahmedabad
Remote
About the Role:
We are looking for a hands-on AWS Data Architect or Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines.

Experience: 7-15 Years
Location: Fully Remote
Company: Armakuni India

Key Responsibilities:
- Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
- Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
- Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
- Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
- Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
- Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
- Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
- Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
- Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).

Required Skills:
- Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
- Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
- Minimum 7 to 15 years of experience in data architecture or related roles.
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
- Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
- Experience with data governance frameworks and tools.
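The Data Integration responsibility above follows the standard extract-transform-load shape. A minimal pure-Python sketch (all names, fields, and the in-memory "warehouse" are hypothetical stand-ins for the real source systems and warehouse tables the role would use):

```python
# Minimal ETL sketch (hypothetical names and data): extract rows from a
# source, transform them (validate and cast amounts, drop bad rows), and
# load the clean rows into a target store.

def extract() -> list[dict]:
    # Stand-in for reading from a source system (API, file, or database).
    return [
        {"id": 1, "amount": "120.50", "region": "us-east"},
        {"id": 2, "amount": "",       "region": "us-west"},  # invalid: empty amount
        {"id": 3, "amount": "87.25",  "region": "eu-west"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Keep only rows with a parseable amount; cast the string to a float.
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # drop rows that fail validation
        clean.append({**row, "amount": amount})
    return clean

def load(rows: list[dict], target: list) -> None:
    # Stand-in for writing to a warehouse table.
    target.extend(rows)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
print(len(warehouse))  # 2 valid rows loaded
```

In an ELT variant, the raw rows would be loaded first and the validation/casting step would run inside the warehouse instead.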
Posted 1 month ago