
3301 Big Data Jobs - Page 16

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

At ClearTrail, you will be part of a team dedicated to developing solutions that empower those focused on ensuring the safety of individuals, locations, and communities. For over 23 years, ClearTrail has been a trusted partner of law enforcement and federal agencies worldwide, committed to safeguarding nations and enhancing lives. We are leading the way in the future of intelligence gathering through innovative artificial intelligence and machine learning-based lawful interception and communication analytics solutions aimed at addressing the world's most complex challenges.

We are currently looking for a Big Data Java Developer with 2-4 years of experience to join our team in Indore. As a Big Data Java Developer at ClearTrail, your responsibilities will include:

- Designing and developing high-performance, scalable applications using Java and big data technologies.
- Building and maintaining efficient data pipelines for processing large volumes of structured and unstructured data.
- Developing microservices, APIs, and distributed systems.
- Working with Spark, HDFS, Ceph, Solr/Elasticsearch, Kafka, and Delta Lake.
- Mentoring and guiding junior team members.

If you are a problem-solver with strong analytical skills, excellent verbal and written communication abilities, and a passion for developing cutting-edge solutions, we invite you to join our team at ClearTrail and be part of our mission to make the world a safer place.
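For context on the kind of pipeline work this role describes (Spark consuming from Kafka and landing data for analytics), here is a minimal, illustrative sketch in PySpark. The broker addresses, topic name, and output paths are hypothetical, and the posting itself targets Java rather than Python, so treat this only as a flavour of the stack.

```python
# Illustrative only: a minimal Spark Structured Streaming job that reads events
# from a Kafka topic and appends them to Parquet files. Brokers, topic, and
# paths are placeholders, not details from the posting.
# Requires the spark-sql-kafka connector on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-ingest-sketch")
    .getOrCreate()
)

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder brokers
    .option("subscribe", "events-topic")                # placeholder topic
    .load()
    .select(
        col("key").cast("string"),
        col("value").cast("string"),
        col("timestamp"),
    )
)

query = (
    events.writeStream
    .format("parquet")  # Delta Lake, as mentioned in the posting, would need the delta-spark package
    .option("path", "/data/lake/events")                  # placeholder output path
    .option("checkpointLocation", "/data/checkpoints/events")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```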

Posted 1 week ago

Apply

5.0 - 16.0 years

0 Lacs

Karnataka

On-site

If you are passionate about driving data governance in a complex, global, and highly collaborative environment, this opportunity in Bangalore is for you. As an experienced professional with 10-16 years of experience, you will be an individual contributor with significant community engagement.

In this role, your key accountabilities will revolve around data governance: you will define and operationalize a federated data governance model across Business Domains, drive the cultural transformation needed to normalize data sharing and responsible utilization within TRD, govern, monitor, and advise on data assets, and lead data-related product life cycle processes with Product Managers.

You will also play a crucial role in the data platform by defining and executing strategies for data collection, storage, and sharing. Representing the TRD unit in transversal engineering initiatives and ensuring data quality standards and compliance with privacy frameworks will be part of your responsibilities, as will fostering cross-Business Line collaboration in data management. In terms of data acquisition and monetization, you will identify and steer transversal data opportunities and innovations.

Your educational background should include a Master's (preferred) or Bachelor's in IT/Data Science/Engineering, along with a minimum of 5 years of experience in a complex, global, or matrix organization. Proven data governance experience is a must-have for this role. Your technical and functional skills should encompass Big Data, Cloud, ML/AI, and Data Management. Familiarity with data governance tools (Collibra preferred) and experience with the SAFe Agile Framework are essential. Data analytics experience, such as with Qlik and Power BI, will be a plus. Proficiency in product management and stakeholder management is also required.

Key attributes that will contribute to your success in this role include flexibility, proactivity, strong negotiation and stakeholder engagement skills, a customer-centric mindset, and budget and planning capabilities.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

As a Cloud & AI Solution Engineer at Microsoft, you will be part of a dynamic team at the forefront of innovation in databases and analytics. Your role will involve working on cutting-edge projects that leverage the latest technologies to drive meaningful impact for commercial customers. If you are insatiably curious and deeply passionate about tackling complex challenges in the era of AI, this is the perfect opportunity for you.

In this role, you will help enterprises unlock the full potential of Microsoft's cloud database and analytics stack. You will collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics. Your responsibilities will include hands-on engagements such as proofs of concept, hackathons, and architecture workshops to guide customers through secure, scalable solution design and to accelerate database and analytics migration into their deployment workflows.

To excel in this position, you should have 10+ years of technical pre-sales or technical consulting experience, or a Bachelor's/Master's degree in Computer Science or a related field with 4+ years of technical pre-sales experience. You should be an expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL) and Azure Analytics (Fabric, Azure Databricks, Purview), as well as on competitors in the data warehouse, data lake, big data, and analytics space. You should also have experience with cloud and hybrid infrastructure, architecture design, migrations, and technology management.

As a trusted technical advisor, you will guide customers through solution design, influence technical decisions, and help them modernize their data platform to realize the full value of Microsoft's platform. You will drive technical sales, lead hands-on engagements, build trusted relationships with platform leads, and maintain deep expertise in Microsoft's analytics portfolio and Azure Databases.

By joining our team, you will have the opportunity to accelerate your career growth, develop deep business acumen, and hone your technical skills. You will be part of a collaborative and creative team that thrives on continuous learning and flexible work opportunities. If you are ready to take on this exciting challenge and be part of a team shaping the future of cloud database and analytics, we invite you to apply and join us on this journey.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Database Administrator at Ideapoke, you will play a crucial role in ensuring the performance, integrity, and security of our information systems databases. Your responsibilities will include providing database design, development, and maintenance services, working on challenging problems in a fast-paced production environment, and managing production databases to support the development teams effectively.

You will be responsible for database capacity planning, performance tuning, and system optimization. Additionally, you will participate in off-hours and weekend maintenance activities as required, monitor and improve database system performance and uptime, and assist developers with query tuning and schema refinement. Creating and maintaining up-to-date documentation of systems and processes, preparing statistical information for internal and external use, and providing technical mentorship to team members are also key aspects of the role.

To excel in this position, you should have a B.E/B.Tech/MCA in computer science or a related field, along with at least 5 years of experience in database administration or a related field. MCSE/MCSA certifications and experience with database technologies such as MySQL, MS SQL, PostgreSQL, Oracle, and MongoDB are preferred. Familiarity with Linux and Windows Server environments, cloud services like AWS/Microsoft Azure, and programming languages including PL/SQL is essential. Experience with Oracle RAC, SQL Server, or MySQL, as well as a good understanding of data and schema standards, database design, troubleshooting, and maintenance, is also a valuable asset.

Moreover, you should demonstrate expertise in SQL (big data knowledge is an added advantage), take ownership and pride in your work, and bring critical thinking, problem-solving, time-management, interpersonal, and communication skills. Join Ideapoke and become an integral part of our journey towards innovation and success.
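As an illustration of the query-tuning work mentioned above, here is a minimal sketch, assuming a PostgreSQL database and the psycopg2 driver; the connection settings and the query are hypothetical, and the posting does not prescribe this workflow.

```python
# Illustrative sketch: pull an execution plan for a slow query so it can be
# reviewed for missing indexes or poor join strategies. Connection details and
# the query itself are placeholders, not taken from the posting.
import psycopg2

conn = psycopg2.connect(
    host="localhost",   # placeholder host
    dbname="appdb",     # placeholder database
    user="dba",         # placeholder user
    password="secret",  # placeholder password
)

query = """
    SELECT o.id, o.total, c.name
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    WHERE o.created_at >= now() - interval '7 days'
"""

with conn, conn.cursor() as cur:
    # EXPLAIN (ANALYZE, BUFFERS) runs the query and reports actual timings and
    # buffer usage, which is the usual starting point for tuning sessions.
    cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + query)
    for (line,) in cur.fetchall():
        print(line)

conn.close()
```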

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

Ideapoke is a global, fast-growing start-up with a presence in Bengaluru, the Bay Area, Tokyo, and Shanghai. Its software, search, and insights drive innovation for Fortune 500 and Global 2000 companies worldwide. Ideapoke's growth is attributed to a dedicated team committed to the company's vision and characterized by a strong work ethic and an entrepreneurial spirit. The company values continuous learning, growth, and making a difference, and invites individuals to join its journey.

As the Artificial Intelligence Lead at Ideapoke, you will collaborate with researchers and engineers across various disciplines to develop advanced data analytic solutions. Your role will involve working on large datasets, mapping business requirements into AI products, and evaluating algorithm performance on real-world data sets. You will mine data from various sources, design and implement machine learning algorithms, and optimize existing algorithms for accuracy and speed. Additionally, you will research and implement deep learning solutions for real-world challenges.

To excel in this role, you should hold a Ph.D., Master's degree, B.Tech, or B.E. in Computer Science, Statistics, Mathematics, Engineering, or a related field, with 10 to 12 years of academic or professional experience in Artificial Intelligence, Data Analytics, Machine Learning, Natural Language Processing, or related areas. Technical proficiency in Python, Java, R, XML parsing, Big Data, NoSQL, and SQL is essential. You should also have a strong mathematical and statistical background, enabling you to understand algorithms and methods from both a mathematical and an intuitive perspective.

The ideal candidate will be a self-starter able to manage multiple research projects, with a flexible approach to learning new skills, a team-player attitude, and strong communication skills. The role involves establishing scalable and efficient processes for model development, validation, implementation, and large-scale data analysis in a distributed cloud environment. By joining Ideapoke, you will contribute to the company's innovative culture and help amplify success for both the organization and its clients.
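To ground the model-development and evaluation work described above, here is a minimal, generic sketch using scikit-learn; the dataset and model choice are illustrative assumptions, not part of the posting.

```python
# Illustrative sketch: train a simple classifier and evaluate it on a held-out
# split, the basic loop behind "evaluating algorithm performance on real-world
# data sets". The dataset and model are placeholders.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("f1:", f1_score(y_test, pred))
```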

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Solution Engineer role at our company spans technology domains such as network, compute, storage, databases, applications, and business. Your responsibilities will include using Big Data and AI features to create solutions that enhance customer application experiences. You will design solutions, validate them with internal and customer teams, and build strong relationships with customers and experts.

Understanding application architectures and workflows will be crucial to identifying key touchpoints and metrics for monitoring and analysis. You will perform data analytics and provide valuable insights and recommendations based on the data. Effective communication with customers, meticulous planning, and successful platform implementation at customer sites will be part of your routine tasks. Collaborating with the product team to develop new features, addressing solution gaps, and showing an eagerness to learn new technologies like Big Data, NoSQL databases, Elasticsearch, MongoDB, and DevOps will be essential.

The ideal candidate will have a B.E/Diploma in Computer Science/Information Technology with at least 3 years of relevant experience. Key requirements for this role include a minimum of 2 years of experience in IT infrastructure management, particularly in implementing and/or using monitoring tools. You should have hands-on experience with NPM and APM tools and with scripting. Knowledge of or experience with the ELK stack for monitoring will be an added advantage. This position is based in Chennai and Mumbai.
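As a small illustration of the ELK-style monitoring analysis this role touches on, here is a sketch that queries Elasticsearch over its REST API for recent errors and aggregates them by service. The endpoint URL, index pattern, and field names are hypothetical placeholders.

```python
# Illustrative sketch: query an Elasticsearch index (as in an ELK monitoring
# setup) for recent error events and aggregate them by service. The URL,
# index pattern, and field names are placeholders.
import requests

ES_URL = "http://localhost:9200"  # placeholder Elasticsearch endpoint
INDEX = "app-logs-*"              # placeholder index pattern

body = {
    "size": 0,
    "query": {
        "bool": {
            "filter": [
                {"term": {"level": "ERROR"}},
                {"range": {"@timestamp": {"gte": "now-15m"}}},
            ]
        }
    },
    "aggs": {
        "errors_by_service": {"terms": {"field": "service.keyword", "size": 10}}
    },
}

resp = requests.post(f"{ES_URL}/{INDEX}/_search", json=body, timeout=10)
resp.raise_for_status()

# Print error counts per service for the last 15 minutes.
for bucket in resp.json()["aggregations"]["errors_by_service"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```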

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Solution Architect, you will be responsible for developing effective IT solutions that meet customer needs and constraints. You will ensure the overall robustness of designs across functional and non-functional parameters, including performance, security, usability, and scalability. A strong spirit of innovation and the ability to solve complex problems independently are crucial for this role.

In this position, you will proactively develop reusable frameworks and assets to enhance the competitive advantage of the solution. It is essential to maintain a comprehensive understanding of the customer's enterprise systems landscape, dependencies, organizational goals, and technologies. You will analyze current processes and practices, suggesting and driving improvements as needed. Collaboration with stakeholders is key to achieving objectives, and you will be expected to analyze industry-standard products and technologies and make appropriate recommendations through proofs of concept or demonstrations.

Acting as a subject matter expert, you will respond to sales leads, bid requests, and new opportunities from customers. Presenting designs, solutions, and applications to customers and influencing them to win bids and opportunities are also integral to this role. Furthermore, you will share your experience, knowledge, and best practices within the organization.

The ideal candidate should have a minimum of 8 years of IT experience and proficiency in designing and architecting large-scale distributed systems, preferably in the Airline, Travel, or Hospitality domain. Sound knowledge of architecture and design patterns, along with a deep understanding of the latest technologies and tools such as big data analytics, cloud, and mobility, is required. Excellent communication and customer-interfacing skills, as well as experience in requirement management, sizing, and solutioning, are essential for success in this role. The ability to understand, interpret, and influence client needs, requirements, expectations, and behavior is critical. Experience across the full project/product development cycle and in supporting applications in a production environment is highly desirable.

Key skills for this IT/Computers-Software industry position include solution designing, Java, J2EE, Big Data, Airline, Aviation, Travel, Logistics, Architecture and Design, and OOAD. A degree in B.Sc, B.Com, M.Sc, MCA, B.E, or B.Tech would be beneficial for this role. If you are interested in this opportunity, please send your resume to jobs@augustainfotech.com.

Posted 1 week ago

Apply

10.0 - 15.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Senior Lead position is a senior developer role in which you will manage a team or department to establish and implement new or revised application systems and programs in coordination with the Technology team. Your main goal in this role is to drive application systems analysis and programming activities.

As a Senior Lead, your responsibilities will include managing one or more Applications Development teams to achieve established goals and handling personnel duties for the team, such as performance evaluations, hiring, and disciplinary actions. You will use your in-depth knowledge and skills across multiple Applications Development areas to provide technical oversight, review and analyze proposed technical solutions, and contribute to the formulation of strategies for applications development and other functional areas. Additionally, you will develop a comprehensive understanding of how different areas of the business integrate to achieve business goals and provide evaluative judgment based on analysis of factual data in complex situations.

You will impact the Applications Development area by monitoring the delivery of end results, participating in budget management, and handling day-to-day staff management issues, including resource management, allocation of work within the team/project, ensuring essential procedures are followed, defining standards, and negotiating with external parties when necessary. You will assess risks when making business decisions and demonstrate consideration for the firm's reputation and compliance with laws and regulations.

To qualify for this role, you should have 10-15 years of relevant experience in the financial services industry, experience in designing and delivering complex multi-system projects, and proven experience in solution design and architecture using technologies such as microservices, big data, and Java. Strong hands-on experience with Java, Spring Boot, databases, JDBC, JMS, REST, big data/distributed systems, etc. is required, as is experience with ALM and CI/CD tools, leading project solution architecture design, and working with both waterfall and Agile methodologies. Leadership and project management skills, clear communication, and experience in a banking/finance environment are preferable. The ideal candidate will have a Bachelor's degree or equivalent experience, with a Master's degree preferred.

This job description provides a high-level overview of the work performed, and other job-related duties may be assigned as required. If you require a reasonable accommodation due to a disability, please review Accessibility at Citi.

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Vadodara

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
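As a flavour of the Snowflake work listed above (running transformations and applying performance-tuning levers from Python), here is a minimal sketch using the snowflake-connector-python package. The account identifier, credentials, and table names are placeholders, and the posting does not prescribe this particular approach.

```python
# Illustrative sketch: connect to Snowflake, build a small mart table, and set a
# clustering key as one performance-tuning lever. Account, credentials, and
# object names are placeholders, not details from the posting.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",   # placeholder account identifier
    user="etl_user",     # placeholder user
    password="secret",   # placeholder password
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Build a daily sales mart from a staging table.
    cur.execute("""
        CREATE OR REPLACE TABLE MART.DAILY_SALES AS
        SELECT order_date, region, SUM(amount) AS total_amount
        FROM STAGING.ORDERS
        GROUP BY order_date, region
    """)
    # Clustering large tables on common filter columns can reduce scan time.
    cur.execute("ALTER TABLE MART.DAILY_SALES CLUSTER BY (order_date)")
finally:
    conn.close()
```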

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Agra

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Agra

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
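To illustrate the kind of Python-based Big Data pipeline this role anchors, here is a minimal PySpark batch sketch; the paths, columns, and partitioning scheme are hypothetical, and the posting does not specify a framework beyond Big Data and Python.

```python
# Illustrative sketch: a small PySpark batch job that reads raw events, derives
# daily aggregates, and writes a partitioned, query-friendly dataset. All paths
# and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-etl-sketch").getOrCreate()

raw = spark.read.json("/data/raw/events/")  # placeholder input path

daily = (
    raw.withColumn("event_date", F.to_date("event_time"))
       .groupBy("event_date", "event_type")
       .agg(
           F.count("*").alias("event_count"),
           F.countDistinct("user_id").alias("unique_users"),
       )
)

(
    daily.write
    .mode("overwrite")
    .partitionBy("event_date")               # partition for cheaper downstream scans
    .parquet("/data/curated/daily_events/")  # placeholder output path
)

spark.stop()
```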

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Vadodara

Work from Office

Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Bharuch

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days
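As a small, generic illustration of the ETL and SQL skills listed above, here is a sketch that loads a CSV, cleans it with pandas, and writes it to a relational staging table via SQLAlchemy; the file name, connection string, and table name are hypothetical.

```python
# Illustrative sketch: a tiny extract-transform-load step. The CSV path,
# connection string, and table name are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://etl:secret@localhost:5432/warehouse")

# Extract
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Transform: drop incomplete rows and derive a reporting column.
orders = orders.dropna(subset=["order_id", "amount"])
orders["order_month"] = orders["order_date"].dt.to_period("M").astype(str)

# Load into a staging table, replacing the previous run.
orders.to_sql("stg_orders", engine, if_exists="replace", index=False)

print(f"Loaded {len(orders)} rows into stg_orders")
```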

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Surendranagar

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Mehsana

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Vadodara

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Surat

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Rajkot

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Gandhinagar

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Bhavnagar

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Jamnagar

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Ahmedabad

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Nagapattinam

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Cuddalore

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Kanyakumari

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc. is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience in AWS & Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works collaboratively with other team members
- Passion for learning and working on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply