3.0 - 7.0 years
0 Lacs
Guwahati, Assam
On-site
We are looking for a highly skilled Software Engineer with strong expertise in Python and a solid understanding of data engineering principles to join our team. You will develop and optimize scalable applications and data workflows, integrate diverse data sources, and support the development of data-driven products. The role requires hands-on experience in software development, data modeling, ETL/ELT pipelines, APIs, and cloud-based data systems, and you will collaborate closely with product, data, and engineering teams to build high-quality, maintainable, and efficient solutions that support analytics, machine learning, and business intelligence initiatives.

Software development responsibilities:
- Design, develop, and maintain Python-based applications, APIs, and microservices with a strong focus on performance, scalability, and reliability.
- Write clean, modular, and testable code following software engineering best practices.
- Participate in code reviews, debugging, and optimization of existing applications.
- Integrate third-party APIs and services as required for application features or data ingestion.

Data engineering responsibilities:
- Build and optimize ETL/ELT pipelines for ingesting, transforming, and storing structured and unstructured data.
- Work with relational and non-relational databases to ensure efficient query performance and data integrity.
- Collaborate with the analytics and ML teams to ensure data availability, quality, and accessibility for downstream use cases.
- Implement data modeling, schema design, and version control for data pipelines.

Cloud and DevOps responsibilities:
- Deploy and manage solutions on cloud platforms (AWS/Azure/GCP) using services such as S3, Lambda, Glue, BigQuery, or Snowflake.
- Implement CI/CD pipelines and participate in DevOps practices for automated testing and deployment.
- Monitor and optimize application and data pipeline performance using observability tools.

Collaboration responsibilities:
- Work cross-functionally with software engineers, data scientists, analysts, and product managers to understand requirements and translate them into technical solutions.
- Provide technical guidance and mentorship to junior developers and data engineers as needed.
- Document architecture, code, and processes to ensure maintainability and knowledge sharing.

Qualifications:
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in Python software development.
- Strong knowledge of data structures, algorithms, and object-oriented programming.
- Hands-on experience building data pipelines.
- Proficiency with SQL and database systems.
- Experience with cloud services and containerization.
- Familiarity with message queues/streaming platforms.
- Strong understanding of APIs, RESTful services, and microservice architectures.
- Knowledge of CI/CD pipelines, Git, and testing frameworks.
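The ETL/ELT pipeline work this posting describes can be sketched as a minimal extract-transform-load flow using only the Python standard library; the table, field names, and upsert target below are hypothetical illustrations, not details from the posting:

```python
import json
import sqlite3

def extract(raw_lines):
    """Parse newline-delimited JSON records from an upstream source."""
    return [json.loads(line) for line in raw_lines]

def transform(records):
    """Normalize fields and drop records missing a primary key."""
    cleaned = []
    for r in records:
        if r.get("id") is None:
            continue  # enforce data integrity: skip keyless rows
        cleaned.append({"id": r["id"], "city": r.get("city", "").strip().title()})
    return cleaned

def load(rows, conn):
    """Idempotently upsert rows into a target table and return its row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, city TEXT)")
    conn.executemany(
        "INSERT INTO users (id, city) VALUES (:id, :city) "
        "ON CONFLICT(id) DO UPDATE SET city = excluded.city",
        rows,
    )
    return conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

conn = sqlite3.connect(":memory:")
raw = ['{"id": 1, "city": " guwahati "}', '{"city": "x"}', '{"id": 1, "city": "Guwahati"}']
count = load(transform(extract(raw)), conn)
```

The upsert makes the load step idempotent, so re-running the pipeline on overlapping batches does not duplicate rows, which is one common way such pipelines stay safe to retry.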
Posted 11 hours ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Technical Account Manager (TAM) at our company, your primary focus will be on ensuring the success of our customers by delivering high-value services. You will oversee governance and technical service delivery, helping customers maximize the business value of their Oracle investments and achieve their desired outcomes while minimizing risk. Building trust as a key advisor, ensuring consistency and quality, helping customers align their IT strategies, overcoming challenges, and applying leading practices for successful Oracle technology and Cloud deployments are essential aspects of the role. The services portfolio you will manage spans Managed Services, On-Premise, Hybrid Cloud, Applications, Platforms, Databases (SaaS/PaaS/IaaS), and Security services.

Your expertise should include:
- Strong troubleshooting skills on Database and related technology products, including Real Application Clusters (RAC).
- Proficiency in analyzing and resolving performance issues, and guidance on Oracle Database best practices.
- Knowledge of Database Security products such as Transparent Data Encryption, Redaction, Data Vault, and Masking.
- The ability to mentor a team of engineers and articulate the benefits of Advanced Compression and In-Memory functionality to customers.
- Additional knowledge of Oracle Enterprise Manager is valuable.

In terms of personal skills, you should have a solid background in service delivery or project management, familiarity with Oracle products and services, experience with Oracle hardware platforms and operating systems, and a track record of working with enterprise customers. Excellent communication and relationship-building skills, a customer-focused mindset, the ability to excel under pressure, strong organizational skills, sound decision-making, and the capacity to manage multiple concurrent activities are crucial for success in this role.

As a Database Administrator, you will monitor, analyze, and optimize database performance in a RAC environment, implement DR solutions using Oracle Standby Database, manage data backup processes, conduct database tuning, analyze diagnostic tools, manage Oracle database instances, identify performance bottlenecks, and collaborate with development teams to optimize SQL code and schema design.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, IT, or a related field, possess Oracle Database certifications (e.g., OCA, OCP), and have 8-12 years of hands-on experience in Oracle Database administration and support. The role requires working a 24x7 shift from the client site in Kerala, with no remote or WFH options available. Your technical skills should include in-depth knowledge of Oracle Database architecture; proficiency in SQL, PL/SQL, and database performance tuning; experience with Oracle Real Application Clusters, Data Guard, ASM, and RMAN; basic familiarity with cloud platforms such as OCI, AWS, or Azure; expertise with Oracle Enterprise Manager and monitoring tools; and an understanding of database security principles such as encryption and user management.
Posted 13 hours ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
We are seeking an experienced Senior Data Analyst with a minimum of 7-8 years of experience in data analysis roles, including significant exposure to Snowflake. Your primary responsibilities will include querying and analyzing data stored in Snowflake databases to derive meaningful insights that support business decision-making, and developing and maintaining data models and schema designs within Snowflake to facilitate efficient data analysis.

In addition, you will create and maintain data visualizations and dashboards using tools such as Tableau or Power BI, with Snowflake as the underlying data source. Collaborating with business stakeholders to understand data requirements and translate them into analytical solutions is a key aspect of this role. You will also perform data validation, quality assurance, and data cleansing activities within Snowflake databases, and support the implementation and enhancement of ETL processes and data pipelines to ensure data accuracy and completeness.

A Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Information Systems, or a related field is required. Certifications in data analytics, data visualization, or cloud platforms are desirable but not mandatory.

Primary skills:
- Strong proficiency in querying and analyzing data using Snowflake SQL and dbt.
- A solid understanding of data modeling and schema design within Snowflake environments.
- Experience with data visualization and reporting tools such as Power BI, Tableau, or Looker for analyzing and presenting insights derived from Snowflake.
- Familiarity with ETL processes and data pipeline development.
- A proven track record of using Snowflake for complex data analysis and reporting tasks.
- Strong problem-solving and analytical skills, including the ability to derive actionable insights from data.
- Experience with programming languages such as Python or R for data manipulation and analysis is a plus.

Secondary skills that would be beneficial include knowledge of cloud platforms and services such as AWS, Azure, or GCP. Excellent communication and presentation skills, strong attention to detail, and a proactive approach to problem-solving are also important, as is the ability to work collaboratively in a team environment.

This role is for a Senior Data Analyst specializing in Snowflake, based in either Trivandrum or Bangalore. The working hours are 8 hours per day from 12:00 PM to 9:00 PM, with a few hours of overlap with the EST time zone for mandatory meetings. The close date for applications is 18-04-2025.
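The data validation and cleansing duties described in this posting can be illustrated by a small rule-driven check; the columns, rules, and region codes below are hypothetical, and in practice the rows would come from a Snowflake query rather than a literal list:

```python
from datetime import date

# Hypothetical validation rules for a fact table feeding a dashboard.
RULES = {
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "order_date": lambda v: isinstance(v, date) and v <= date.today(),
    "region": lambda v: v in {"NORTH", "SOUTH", "EAST", "WEST"},
}

def validate(rows):
    """Return (clean_rows, failures) where failures maps row index -> bad fields."""
    clean, failures = [], {}
    for i, row in enumerate(rows):
        bad = [col for col, ok in RULES.items() if not ok(row.get(col))]
        if bad:
            failures[i] = bad
        else:
            clean.append(row)
    return clean, failures

rows = [
    {"amount": 120.5, "order_date": date(2024, 1, 3), "region": "NORTH"},
    {"amount": -4, "order_date": date(2024, 1, 4), "region": "MOON"},
]
clean, failures = validate(rows)
```

Keeping the rules in a dictionary makes the check easy to extend as new quality requirements arrive, and the failure report gives analysts something concrete to cleanse.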
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Haryana
On-site
As a Data Engineer at GreyOrange, you will design, develop, and maintain ETL pipelines to ensure efficient data flow for high-scale data processes. Your primary focus will be managing and optimizing data storage and retrieval in Google BigQuery while ensuring performance efficiency and cost-effectiveness. Additionally, you will set up quick analytics dashboards using tools like Metabase, Looker, or another preferred platform.

Collaboration with internal analysts and stakeholders, including customers, is key to understanding data needs and implementing robust solutions. You will play a crucial role in monitoring, troubleshooting, and resolving data pipeline issues to maintain data integrity and availability. Implementing data quality checks and maintaining data governance standards across the ETL processes will also be part of your responsibilities. Automation will be a significant aspect of your role: you will develop scripts for repetitive tasks and optimize manual processes. Documenting the entire analytics implementation and data structures will provide a guide for all users. Staying current with industry best practices and emerging technologies in data engineering and cloud-based data management is essential.

Requirements:
- 4+ years of experience as a Data Engineer or in a similar role.
- Strong experience with ETL tools and frameworks such as Apache Airflow, Dataflow, or Estuary.
- Proficiency in SQL and extensive experience with Google BigQuery.
- Experience setting up analytics dashboards using tools like Looker or Metabase.
- Knowledge of data warehousing concepts and best practices.
- Experience with cloud platforms, particularly Google Cloud Platform (GCP).
- Strong analytical and problem-solving skills, with a focus on cost optimization in cloud environments.
- Familiarity with Python or other scripting languages for automation and data processing.
- Excellent communication skills and the ability to work collaboratively in a team environment.
- Experience with data modeling and schema design.

You will also have the opportunity to provide guidance and mentorship to junior data engineers and data analysts when needed.
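The pipeline-monitoring responsibility mentioned above might look, in outline, like the following freshness check; the table names and SLA thresholds are invented for illustration:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLAs: alert when a table's last load is too old.
FRESHNESS_SLA = {"orders": timedelta(hours=1), "inventory": timedelta(hours=6)}

def stale_tables(last_loaded, now=None):
    """Return the tables whose last successful load breaches its SLA."""
    now = now or datetime.now(timezone.utc)
    return sorted(
        table for table, sla in FRESHNESS_SLA.items()
        if now - last_loaded[table] > sla
    )

now = datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc)
last = {
    "orders": now - timedelta(minutes=30),   # loaded recently: fine
    "inventory": now - timedelta(hours=7),   # breaches its 6-hour SLA
}
alerts = stale_tables(last, now=now)
```

A check like this is typically run on a schedule, with the returned list fed to an alerting channel so stale pipelines are caught before dashboards go quietly wrong.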
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Jaipur, Rajasthan
On-site
We are seeking a Data Cloud Integration Specialist to manage data ingestion, unification, and activation in Salesforce Data Cloud and external systems. The ideal candidate has practical experience building robust integration pipelines, working with APIs, and enabling seamless data flow among platforms.

Your responsibilities will include designing and executing data ingestion workflows from diverse sources into Salesforce Data Cloud, harmonizing data from multiple systems to establish a comprehensive Customer 360 view, and building and maintaining integrations using APIs, ETL tools, and middleware solutions. You will work closely with data architects, developers, and business teams to gather integration requirements, monitor and improve integration performance for accuracy and real-time data availability, and ensure compliance with data privacy and governance policies. Additionally, you will activate unified data for use across Salesforce Marketing, Sales, and Service Clouds.

Key Requirements:
- Proficiency in Salesforce Data Cloud (formerly Salesforce CDP).
- Strong expertise in ETL processes, data mapping, and transformation logic.
- Hands-on experience with REST/SOAP APIs and integration tools such as MuleSoft or equivalent.
- Sound understanding of data modeling, schema design, and customer data platforms.
- Familiarity with data privacy regulations such as GDPR and CCPA.

Preferred Qualifications:
- Familiarity with Salesforce Marketing Cloud, Service Cloud, and Customer 360.
- Experience with cloud data platforms such as Snowflake, Redshift, or BigQuery.
- Salesforce certifications such as Data Cloud Consultant or Integration Architect are advantageous.

This is a contractual/temporary position with a Monday-to-Friday work schedule. The ideal candidate has at least 6 years of experience as a Data Integration Specialist and 5 years of experience with Salesforce Data Cloud. Work Location: In person.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a GCP Developer, you will be responsible for maintaining the stability of production platforms, delivering new features, and minimizing technical debt across various technologies. A minimum of 4 years of experience in the field is required, along with a strong commitment to high standards and a genuine passion for quality work. Proficiency in GCP, Python, Hadoop, Spark, Scala, cloud services, streaming (Pub/Sub), Kafka, SQL, Dataproc, and Dataflow is essential for this role, as is familiarity with data warehouses, distributed data platforms, and data lakes. You should possess knowledge of database definition, schema design, and Looker Views and Models; a solid understanding of data structures and algorithms is crucial for success in this position. Experience with CI/CD practices would be advantageous. This position involves working in a dynamic environment across multiple locations: Chennai, Hyderabad, and Bangalore. A total of 20 positions are available for qualified candidates.
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
Discover a workplace with an inclusive culture and abundant career growth opportunities at TJX India, a Fortune 100 NYSE-listed company. Join our truly global IT organization, which collaborates seamlessly across North America, Europe, Asia, and Australia in a challenging, collaborative, team-based environment. Our team consists of over 50 Associates located across the globe, with plans to grow through expansion in our India office. Expanding TJX's presence in India will bring knowledge and expertise, local leadership, and local decision-making, enabling us to deliver the right work, at the right time, in the right way. Over the next 12 months, our Indian presence will be the largest representation of the team in any global geography.

As a Data Analyst within the Merchandise Operations Management (MOM) team, you will dive deep into TJX's vast financial and inventory data, working to demonstrate the integrity of MOM data and uncover new insights for our business partners. You will serve as an ambassador for MOM data, working with ART team members and external IT partners to generate actionable and trustworthy insights. Beyond that, you will play a pivotal role in the modernization of the TJX RPM, ReSA, and ReIM applications. Together, we will create and maintain data integrity models and dynamic dashboards, fueling our modernization and strategic growth initiatives with data-driven insights.

Responsibilities:
- Contribute to creating and executing validation models that quantify and explain differences between a variety of datasets.
- Analyze complex valuation processes within the merchandise and purchase order financial landscapes.
- Collaborate with both IT and business teams to extract meaningful insights that support and substantiate the implementation of the Oracle MOM product suite.
- Communicate findings effectively to business partners, facilitating understanding and meaningful data-driven decision-making.
- Support ad-hoc analysis, custom reporting, and model engineering to build trust with business partners.
- Support the MOM ART through data preparation, cleaning, and reporting of performance metrics.
- Support the Solution Delivery management team in defining, measuring, and reporting on key system stability metrics.
- Create, update, and improve various production data models and reports.
- Support automation activities to streamline data-driven insights and provide operational efficiencies.

Minimum qualifications:
- Bachelor's degree in technology or information systems.
- At least 6 years of overall industry experience.
- 3+ years of data analytics experience.
- Proficient-to-advanced skills in the Microsoft Power Platform suite, Excel, Power BI, and Power Query.
- Strong mathematical acumen for quantitative analysis.
- Expertise in database querying and design.
- Proficiency in data modeling and schema design.
- Skilled problem solver with the ability to think outside the box and embrace new ways of thinking.
- Adaptable to fast-paced environments.
- Interpersonal and relationship-building skills.
- Attention to detail and sound analytical skills.
- Proficiency in DAX and the M language.
- Familiarity with, and the ability to construct, sound designs of experiments.
- Proficiency with exploratory data analysis.
- Curiosity and an inquisitive mindset.

Preferred qualifications:
- Knowledge of Oracle Retail products (RMS, ReSA, RPM, ReIM).
- Certifications in the data analytics space.

Join TJX India and contribute to the company's growth to $60B, showcasing your team leadership skills and fostering a collaborative and innovative engineering environment. Be part of a team that shapes the future of merchandise operations on a global scale.
Posted 4 days ago
1.0 - 5.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Data Engineer at Synoptek, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines on the Google Cloud Platform (GCP). You will leverage your hands-on experience with GCP services such as BigQuery, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage, along with Jitterbit, to build efficient data processing solutions. Collaborating with cross-functional teams, you will translate their data needs into technical requirements, ensuring data quality, integrity, and security throughout the data lifecycle.

Your role will involve developing and optimizing ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Additionally, you will build and maintain data models and schemas to support business intelligence and analytics, while troubleshooting data quality issues and performance bottlenecks.

To excel in this position, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with 3 to 4 years of experience as a Data Engineer focusing on GCP. Proficiency in Python, SQL, and BigQuery is essential, as is hands-on experience with data ingestion, transformation, and loading tools such as Jitterbit and Apache Beam. A strong understanding of data warehousing and data lake concepts, coupled with experience in data modeling and schema design, will be beneficial.

The ideal candidate will exhibit excellent problem-solving and analytical skills, working both independently and collaboratively with internal and external teams. Familiarity with acquiring and managing data from various sources, and the ability to identify trends in complex datasets and propose business solutions, are key attributes for success in this role. At Synoptek, we value employees who embody our core DNA behaviors: clarity, integrity, innovation, accountability, and a results-focused mindset. We encourage continuous learning, adaptation, and growth in a fast-paced environment, promoting a culture of teamwork, flexibility, respect, and collaboration. If you have a passion for data engineering, a drive for excellence, and a commitment to delivering impactful results, we invite you to join our dynamic team at Synoptek. Work hard, play hard, and let's achieve superior outcomes together.
Posted 4 days ago
8.0 - 14.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a MongoDB Database Administrator at our company, you will play a crucial role in designing, implementing, and maintaining MongoDB databases. With 8-14 years of experience, you will be responsible for ensuring the performance, availability, and security of our MongoDB instances. Your expertise in database design, data modeling, performance tuning, and optimization will be key to your success in this role.

Key Responsibilities:
- Design and implement MongoDB database solutions to support business applications.
- Ensure high levels of performance, availability, sustainability, and security.
- Analyze and resolve database performance issues.
- Develop and maintain database standards and documentation.
- Collaborate with development teams to design and optimize database queries.
- Perform database tuning and maintenance activities.
- Implement backup and recovery strategies.
- Monitor database performance and provide recommendations for improvements.
- Stay updated on the latest MongoDB features and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a MongoDB Database Architect or in a similar role.
- In-depth knowledge of MongoDB architecture and design.
- Experience with database performance tuning and optimization.
- Strong understanding of data modeling and schema design.
- Proficiency in SQL and NoSQL databases.
- Familiarity with cloud-based database solutions (e.g., AWS, Azure).
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- MongoDB certification.
- Experience with other NoSQL databases (e.g., Cassandra, Couchbase).
- Knowledge of scripting languages (e.g., Python, Bash).
- Experience with DevOps practices and tools.
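Schema design in MongoDB is often expressed as a `$jsonSchema` validator attached to a collection. The sketch below shows a hypothetical validator document for an invented "orders" collection, plus a tiny stdlib-only check that mirrors its `required` clause; in a real deployment the dict would be passed to `create_collection(..., validator={"$jsonSchema": ...})` via a driver such as PyMongo:

```python
# Hypothetical $jsonSchema validator for an "orders" collection.
order_schema = {
    "bsonType": "object",
    "required": ["customer_id", "items", "status"],
    "properties": {
        "customer_id": {"bsonType": "objectId"},
        "status": {"enum": ["placed", "shipped", "delivered"]},
        "items": {
            "bsonType": "array",
            "minItems": 1,
            "items": {"bsonType": "object", "required": ["sku", "qty"]},
        },
    },
}

def missing_required(doc, schema):
    """Local sanity check mirroring only the top-level 'required' clause;
    type and enum enforcement would be done server-side by MongoDB itself."""
    return [field for field in schema["required"] if field not in doc]

doc = {"customer_id": "abc123", "items": [{"sku": "X1", "qty": 2}]}
gaps = missing_required(doc, order_schema)
```

Expressing the schema as data like this lets a DBA version-control it alongside migration scripts, which fits the documentation and standards duties listed above.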
Posted 4 days ago
3.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You should have a total of 8+ years of development/design experience, with a minimum of 3 years' experience in Big Data technologies, both on-premises and in the cloud. Proficiency in Snowflake and strong SQL programming skills are required, along with strong experience in data modeling and schema design and extensive experience with data warehousing tools such as Snowflake, BigQuery, or Redshift. Experience with at least one BI tool such as Tableau, QuickSight, or Power BI is a must. You should also have strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring. A good understanding of data governance, security and compliance, data quality, metadata management, master data management, and data catalogs is essential.

Moreover, you should have a strong understanding of cloud services (AWS or Azure), including IAM and log analytics. Excellent interpersonal and teamwork skills are necessary for this role, as is experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required. Your day-to-day responsibilities will draw on all of the skills described above.

At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development: you will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will have the chance to work on projects that matter and engage your curiosity and creative problem-solving skills.

We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. GlobalLogic is a high-trust organization where integrity is key; by joining us, you are placing your trust in a safe, reliable, and ethical global company.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 week ago
7.0 - 9.0 years
7 - 17 Lacs
Coimbatore
Work from Office
Role Overview: As the Technical Lead, you will take ownership of our technology strategy, system architecture, and engineering execution. You'll wear multiple hats, from designing real-time trading systems and market data pipelines to mentoring engineers, managing infrastructure, and collaborating with founders on product direction. This role is ideal for someone who thrives in fast-paced, high-ownership environments, has full-stack and systems-level expertise, and can bridge product vision with scalable, maintainable tech execution in the financial technology domain.

Responsibilities:

Technical Leadership
- Own and evolve the full technology stack: backend, frontend, infrastructure, and DevOps.
- Architect scalable, low-latency systems for real-time data ingestion, analytics, and strategy execution.
- Conduct detailed code and architecture reviews to ensure quality, performance, and security.
- Mentor and support engineers; lead by example with high engineering standards.

Product & Business Alignment
- Collaborate with founders and business teams to translate product vision into technical milestones.
- Align technical decisions with product, compliance, and trading goals.
- Evaluate trade-offs between speed, cost, and long-term maintainability.

Team & Culture
- Promote a collaborative, ownership-driven engineering environment.

Must-Have Skills:
- Languages: Expert in at least one of Scala, Ruby on Rails, Node.js, React.js, TypeScript.
- Databases: Proficient with relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Redis, Cassandra, RocksDB) databases; experience with time-series databases (e.g., TimescaleDB) is a strong plus.
- DevOps & Infra: Hands-on with Docker, Kubernetes, and CI/CD pipelines.
- Cloud: Experience with AWS, DigitalOcean, or similar cloud platforms.
- Architecture: Strong experience designing microservices, event-driven systems, and database schemas.
- Soft Skills: Excellent communication, PR/code reviews, and cross-functional leadership.

Good-to-Have Skills:
- Startup experience, or having played a founding/leadership engineering role.
- Exposure to UI/UX thinking and frontend frameworks beyond React.
- Knowledge of system design for scale, performance, and security.
- Experience mentoring and building diverse engineering teams.

Why Join Us:
- Be part of a mission-driven fintech startup at a transformative stage.
- Take full ownership of the tech stack and influence product direction.
- Collaborate with passionate founders and a lean, high-performance team.
- Work on cutting-edge financial systems that impact real traders and investors.

About Us: Simply Algo Fintech was founded in 2019 by a team of visionary founders with a shared passion for building intelligent algorithmic trading platforms. What began as a bootstrapped venture has evolved into a cutting-edge fintech company empowering both retail and institutional traders through data-driven technology. We specialize in delivering seamless, user-friendly platforms that require no coding knowledge, making algorithmic trading accessible to everyone. Our solutions are designed to simplify strategy development, testing, and execution with speed and precision.

Website: https://www.simplyalgo.in/
LinkedIn: https://www.linkedin.com/company/simply-algo-fintech/?viewAsMember=true
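The event-driven architecture this role calls for can be sketched, at its simplest, as an in-process publish/subscribe bus; the topic name and payload below are illustrative only (a production trading system would use a broker such as Kafka rather than an in-memory dict):

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process pub/sub bus illustrating the event-driven style."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Fan the event out to every handler registered for the topic.
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
fills = []
bus.subscribe("order.filled", fills.append)    # e.g. a position-tracking service
bus.subscribe("order.filled", lambda e: None)  # e.g. an audit-log service
bus.publish("order.filled", {"symbol": "INFY", "qty": 10})
```

The key property, that publishers know nothing about their subscribers, is what lets new services (risk checks, analytics) attach to existing event streams without touching the producing code.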
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Haryana
On-site
The ideal candidate will be responsible for building the entire backend platform for a product portfolio and ensuring end-to-end delivery of new features. You will evolve the architecture for performance and scalability, and design, develop, and own components of a highly scalable, distributed web services platform. It will be essential to constantly strive to improve the software development process and team productivity. Additionally, you will be expected to mentor and train team members and lead module development independently.

In this role, it is crucial to demonstrate leadership and planning through proactive behavior and independent problem-solving. Strong collaboration with other test team members and development team members will be necessary to meet goals effectively.

To be successful in this position, you should have at least 5.5 years of experience as a Lead Engineer in a scalable product/ecommerce organization. Proficiency in Java, a deep understanding of the Spring framework and the MVC approach, and strong knowledge of performance optimization and caching techniques are required, as is a solid grasp of object-oriented programming concepts, data structures, and algorithms. Experience developing scalable, fault-tolerant, distributed backend services; familiarity with prevalent design patterns and advanced system design; and proficiency in databases and schema design, particularly NoSQL databases, are highly valued. Strong problem-solving skills will also be crucial for excelling in this role.
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer, you will be responsible for designing, building, and maintaining data pipelines and infrastructure to support data-driven initiatives. Your primary focus will be to ensure efficient collection, storage, and processing of data for analysis and business use. Your key responsibilities will include designing, implementing, and optimizing end-to-end data pipelines for ingesting, processing, and transforming large volumes of both structured and unstructured data, and building and maintaining data infrastructure that facilitates effective data utilization within organizations.

In this role, you will design and maintain data models, schemas, and database structures to support various analytical and operational use cases. Ensuring data quality, accuracy, and security throughout the data lifecycle will be crucial to your success. Collaboration will be a key aspect of your work: you will work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver tailored solutions. Problem-solving skills will also be essential as you identify and address data-related challenges to ensure data availability for analysis and decision-making. Remaining up to date with the latest data engineering technologies and tools will be necessary to excel in this role.

This position is full-time and permanent, and open to fresher candidates. Benefits include food provision, a day-shift schedule with morning timing, and a performance bonus. The work location is in person.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As an experienced professional with 8-12 years of experience in database technologies, scripting, BI tools, and data science, you will be responsible for understanding customer requirements, documenting technical designs, and developing ETL jobs, reports, dashboards, and data marts. You will play a key role in translating business requirements into effective designs and implementing solutions using a variety of tools and technologies.

Your responsibilities will include:
- Understanding customer requirements, problem statements, and business use cases.
- Documenting customer requirements and technical designs.
- Developing and testing ETL jobs, query objects, data marts, reports, and dashboards.
- Designing OLAP cubes, SQL reports, interactive charts, and dashboards.
- Writing stored procedures, functions, packages, scripts, and complex SQL queries.
- Reviewing existing report designs, database schemas, and SQL code to improve performance and operability.
- Collaborating closely with customers to fine-tune solutions and troubleshoot production issues.
- Monitoring tasks efficiently and providing status reports to senior management, customers, and stakeholders.
- Analyzing and resolving problems, including debugging at the OS, database, or network level where necessary.
- An understanding of cloud deployment architecture and Big Data platforms is an added advantage.

To excel in this role, you must have hands-on experience with database technologies such as Oracle, MS SQL Server, MySQL, and PostgreSQL, along with proficiency in languages such as Java, Python, and R. Experience with contemporary reporting and BI tools like Crystal Reports, Tableau, and Power BI is essential. Knowledge of ETL platforms like Informatica and SSIS, as well as data modeling and data warehouse design, is crucial.
Your expertise in data science, AI, and ML will be utilized for designing and implementing data-driven use cases. Strong communication skills, problem-solving abilities, and familiarity with project management concepts and Agile delivery methodology are necessary for successful project execution. If you are seeking a challenging opportunity to leverage your technical skills and contribute to innovative solutions in the field of data analytics and business intelligence, this role offers a dynamic environment where you can thrive and make a significant impact.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
You will be responsible for building the entire backend platform for a product portfolio and delivering new features end-to-end. Your role will involve evolving the architecture to improve performance and scalability. You will design, develop, and take ownership of components within a highly scalable, distributed web services platform. It is essential to continuously work towards enhancing the software development process and team productivity. Additionally, you will be expected to mentor and train team members and lead module development independently. Demonstrating leadership and proactive planning through independent problem-solving will be a key aspect of your role. Strong collaboration with both test team members and development team members is crucial to achieve common goals effectively. To excel in this position, you should have at least 5.5 years of experience as a Lead Engineer in a scalable product or ecommerce organization. Proficiency in Java, a solid understanding of the Spring framework, and familiarity with the MVC approach are essential. You should possess a strong knowledge of performance optimization and caching techniques, along with expertise in Object-Oriented Programming concepts, data structures, and algorithms. Experience in developing scalable, fault-tolerant, distributed backend services, as well as a good grasp of prevalent design patterns and advanced system design, is required. Moreover, you should have practical experience with databases and schema design, particularly with NoSQL databases. Strong problem-solving skills will also play a significant role in your success in this role.
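The "performance optimization and caching techniques" these backend roles call for often start with memoizing expensive lookups in front of a database or remote service. A minimal sketch using Python's standard-library cache — the `get_product` function and its return shape are hypothetical, standing in for any costly backend call:

```python
from functools import lru_cache

call_count = 0  # tracks how many times the "expensive" backend is actually hit

@lru_cache(maxsize=256)
def get_product(product_id: int) -> dict:
    """Simulated expensive lookup (e.g. a database query or remote service call)."""
    global call_count
    call_count += 1
    return {"id": product_id, "name": f"product-{product_id}"}

# Repeated requests for the same key are served from the in-process cache,
# so the backend is only hit once per distinct product_id.
first = get_product(42)
second = get_product(42)
```

In a distributed service the same idea typically moves to a shared cache tier (e.g. a key-value store) so all instances benefit, with an eviction or TTL policy replacing `maxsize`.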
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
haryana
On-site
As a Backend Developer at our organization, you will be responsible for building the entire backend platform for our product portfolio. Your role will involve the end-to-end delivery of new features, as well as evolving the architecture to ensure performance and scalability. You will design, develop, and own components of a highly scalable, distributed web services platform. Constantly striving to improve the software development process and team productivity will be a key part of your responsibilities. Additionally, you will mentor and train team members while leading module development independently. To be successful in this role, you should have at least 2.5 years of experience in a scalable product/ecommerce organization as a Senior Software Engineer. Excellent Java skills, a deep understanding of the Spring framework, and proficiency in the MVC approach are essential. You should possess strong knowledge of performance optimization and caching techniques, as well as a solid grasp of Object-Oriented Programming concepts, data structures, and algorithms. Experience in developing scalable, fault-tolerant, distributed backend services, along with familiarity with prevalent design patterns and advanced system design, is required. Proficiency in databases, especially NoSQL databases and schema design, is crucial for this role. Strong problem-solving skills will also be beneficial in fulfilling your responsibilities effectively.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing, developing, and maintaining enterprise-grade search solutions using Apache Solr and SolrCloud. Your key tasks will include developing and optimizing search indexes and schemas for various use cases such as product search, document search, or order/invoice search. Additionally, you will be required to integrate Solr with backend systems, databases, and APIs, implementing features like full-text search, faceted search, auto-suggestions, ranking, and relevancy tuning. It will also be part of your role to optimize search performance, indexing throughput, and query response time for efficient results. Your expertise in Apache Solr and SolrCloud, along with a strong understanding of Lucene, inverted indexes, analyzers, tokenizers, and search relevance tuning, will be essential for this position. Proficiency in Java or Python for backend integration and development is required, as well as experience with RESTful APIs, data pipelines, and real-time indexing. Familiarity with Zookeeper, Docker, and Kubernetes for SolrCloud deployments, and knowledge of JSON, XML, and schema design in Solr, will also be necessary. Furthermore, your responsibilities will include ensuring data consistency and high availability using SolrCloud and Zookeeper for cluster coordination and configuration management. You will be expected to monitor the health of the search system and troubleshoot any issues that may arise in production. Collaboration with product teams, data engineers, and DevOps teams will be crucial for ensuring smooth delivery. Staying updated with new features of Apache Lucene/Solr and recommending improvements will also be part of your role. Preferred qualifications for this position include a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Experience with Elasticsearch or other search technologies will be advantageous, as will working knowledge of CI/CD pipelines and cloud platforms such as Azure.
Overall, your role will involve working on search solutions, optimizing performance, ensuring data consistency, and collaborating with cross-functional teams for successful project delivery.
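A faceted search like the product search described above is driven by a handful of standard Solr request parameters (`q`, `fq`, `facet`, `facet.field`) sent to a collection's `/select` handler. The sketch below only builds the request URL — the host, port, and `products` collection name are assumptions, and the HTTP call itself is omitted since it needs a running Solr instance:

```python
from urllib.parse import urlencode

# Sketch of a faceted product search against a Solr /select handler.
# Base URL and collection name are illustrative; q, fq, facet, and
# facet.field are standard Solr query parameters.
base_url = "http://localhost:8983/solr/products/select"

params = [
    ("q", "title:laptop"),      # full-text query against the title field
    ("fq", "in_stock:true"),    # filter query: cached separately, no scoring
    ("facet", "true"),
    ("facet.field", "brand"),   # facet counts per brand
    ("facet.field", "category"),
    ("rows", "10"),
    ("wt", "json"),
]

query_url = f"{base_url}?{urlencode(params)}"
# A GET on query_url would return matching documents plus per-field facet
# counts, which a storefront can render as clickable filters.
```

Filter queries (`fq`) are a common relevancy-neutral optimization: Solr caches their document sets independently of the main query, which is one of the query-response-time levers the posting mentions.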
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
Join us as a Senior Developer at Barclays, where you will spearhead the evolution of the digital landscape, driving innovation and excellence. You will utilize cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. You will be assessed on critical skills relevant for success in the role, including experience with skills to meet business requirements and job-specific skillsets.

To be successful as a Senior Developer, you should have experience with:

Basic/Essential Qualifications:
- Graduate/Postgraduate with hands-on experience (8+ years) in Microsoft .NET, C#, ASP.NET MVC, Web Services, Web API, .NET Core, JavaScript, jQuery, HTML5, and CSS3.
- Previous experience with RESTful services is a big plus.
- Good knowledge and understanding of pricing different products.
- Experience with relational database systems, schema design, SSIS, SQL (MS SQL Server), and stored procedures.
- Strong general development practices such as OOAD, design patterns, continuous integration, unit testing, and Agile process.
- A structured approach to problem-solving and the ability to manage parallel streams of work.
- Experience with technologies supporting development, continuous integration, automated testing, and deployment.
- Ability to mentor and guide junior team members.

Desirable skillsets/good to have:
- Knowledge and experience with OpenShift and other cloud-based solutions.
- UI framework expertise.

This role will be based out of Pune.

Purpose of the role: To design, develop, and improve software using various engineering methodologies that provide business, platform, and technology capabilities for customers and colleagues.

Accountabilities:
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools.
- Collaborating with product managers, designers, and engineers to define software requirements, devise solution strategies, and ensure seamless integration with business objectives.
- Participation in code reviews and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends, contributing to the organization's technology communities, and fostering a culture of technical excellence and growth.
- Adherence to secure coding practices and implementation of effective unit testing practices.

Expectations for Assistant Vice President:
- Advising and influencing decision-making, contributing to policy development, and ensuring operational effectiveness.
- Leading a team to deliver work impacting the whole business function.
- Setting objectives, coaching employees, and appraising performance relative to objectives.
- Demonstrating leadership behaviours to create an environment for colleagues to thrive and deliver to an excellent standard.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Tech Lead, Data Architecture at Fiserv, you will play a crucial role in our data warehousing strategy and implementation. Your responsibilities will include designing, developing, and leading the adoption of Snowflake-based solutions to ensure efficient and secure data systems that drive our business analytics and decision-making processes. Collaborating with cross-functional teams, you will define and implement best practices for data modeling, schema design, and query optimization in Snowflake. Additionally, you will develop and manage ETL/ELT workflows to ingest, transform, and load data from various sources into Snowflake, integrating data from diverse systems such as databases, APIs, flat files, and cloud storage. Monitoring and tuning Snowflake performance, you will manage caching, clustering, and partitioning to enhance efficiency while analyzing and resolving query performance bottlenecks. You will work closely with data analysts, data engineers, and business users to understand reporting and analytic needs, ensuring seamless integration with BI tools like Power BI. Your role will also involve collaborating with the DevOps team for automation, deployment, and monitoring, as well as planning and executing strategies for scaling Snowflake environments as data volume grows. Keeping up to date with emerging trends and technologies in data warehousing and data management is essential, along with providing technical support, troubleshooting, and guidance to users accessing the data warehouse. To be successful in this role, you must have 8 to 10 years of experience in data management tools like Snowflake, StreamSets, and Informatica. Experience with monitoring tools like Dynatrace and Splunk, Kubernetes cluster management, and Linux is required. Additionally, familiarity with containerization technologies, cloud services, CI/CD pipelines, and banking or financial services experience would be advantageous. Thank you for considering employment with Fiserv.
To apply, please use your legal name, complete the step-by-step profile, and attach your resume. Fiserv is committed to diversity and inclusion and does not accept resume submissions from agencies outside of existing agreements. Beware of fraudulent job postings not affiliated with Fiserv, and protect your personal information and financial security.
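Incremental ELT loads of the kind this role manages usually hinge on a MERGE/upsert: new source rows are inserted, changed rows update the existing target. The sketch below uses SQLite's upsert as a stand-in for a warehouse MERGE statement (Snowflake, BigQuery, etc.); the `dim_customer` schema and sample rows are illustrative assumptions:

```python
import sqlite3

# ELT-style incremental load sketched with SQLite's upsert standing in for a
# warehouse MERGE. Table and column names here are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer "
    "(customer_id INTEGER PRIMARY KEY, email TEXT, updated_at TEXT)"
)
conn.execute("INSERT INTO dim_customer VALUES (1, 'a@example.com', '2024-01-01')")

incoming = [
    (1, 'a+new@example.com', '2024-02-01'),  # existing key: becomes an update
    (2, 'b@example.com', '2024-02-01'),      # new key: becomes an insert
]
conn.executemany(
    """INSERT INTO dim_customer (customer_id, email, updated_at)
       VALUES (?, ?, ?)
       ON CONFLICT(customer_id) DO UPDATE SET
           email = excluded.email,
           updated_at = excluded.updated_at""",
    incoming,
)
rows = conn.execute(
    "SELECT customer_id, email FROM dim_customer ORDER BY customer_id"
).fetchall()
```

In a real warehouse the `incoming` batch would be a staged table rather than an in-memory list, and clustering/partitioning keys on the target table keep the merge from scanning more data than it must.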
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a software developer in our team, you will play a crucial role in understanding customer needs by collaborating with product managers and business stakeholders. Your responsibilities will include designing, developing, delivering, and supporting large-scale and distributed software applications and tools in an agile, startup-like environment. You will be expected to prepare necessary documents such as flowcharts and workflows to identify requirements and solutions, as well as take the initiative to invent new solutions for our customers. Building the entire backend platform for a product portfolio will be a key part of your role, along with designing and leading backend architecture implementation in an innovative environment. Ownership of features across the entire life cycle, from inception to deployment in production, will be essential. You will be encouraged to pick up new technologies and frameworks that best suit the needs of our products and users, while using software engineering best practices to ensure high standards of quality and maintainability for all deliverables. Your technical competencies should include proficiency in Go, an understanding of Kubernetes and OCP, and familiarity with CI/CD pipelines, the Spring framework, the MVC approach, performance optimization, caching techniques, databases, schema design, object-oriented programming concepts, data structures, and algorithms. Moreover, your leadership competencies should encompass customer obsession, collaboration, influence, an ownership mindset, learning agility, navigating change, leaders building leaders, and execution excellence.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You have a total of 4-6 years of development/design experience, with a minimum of 3 years of experience in Big Data technologies on-premises and in the cloud. You should be proficient in Snowflake and possess strong SQL programming skills. Your role will require strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools like Snowflake, BigQuery, or Redshift and BI tools like Tableau, QuickSight, or Power BI (hands-on experience with at least one of each is required). You must also have experience with orchestration tools like Airflow and the transformation tool dbt. Your responsibilities will include implementing ETL/ELT processes and building data pipelines, along with workflow management, job scheduling, and monitoring. You should have a good understanding of data governance, security and compliance, data quality, metadata management, master data management, and data catalogs, as well as cloud services (AWS), including IAM and log analytics. Excellent interpersonal and teamwork skills are essential, along with experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required. At GlobalLogic, the culture prioritizes caring and inclusivity. You'll join an environment where people come first, fostering meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. Meaningful work awaits you at GlobalLogic, where you'll have the chance to work on impactful projects and engage your curiosity and problem-solving skills. The organization values balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a healthy balance between work and life. GlobalLogic is a high-trust organization where integrity is key, ensuring a safe, reliable, and ethical global environment for all employees.
Truthfulness, candor, and integrity are fundamental values upheld in everything GlobalLogic does. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner that collaborates with the world's largest and most forward-thinking companies. Leading the digital revolution since 2000, GlobalLogic helps create innovative digital products and experiences, transforming businesses and redefining industries through intelligent products, platforms, and services.
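The workflow management and job scheduling this role covers reduce to resolving a DAG of task dependencies — the core idea behind orchestrators like Airflow and dbt. A minimal sketch using Python's standard-library topological sorter, with purely illustrative task names:

```python
from graphlib import TopologicalSorter

# DAG-style dependency resolution, the idea underlying Airflow/dbt scheduling.
# Each task maps to the set of tasks it depends on; names are illustrative.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "stage_orders": {"extract_orders"},
    "stage_customers": {"extract_customers"},
    "build_mart": {"stage_orders", "stage_customers"},
    "refresh_dashboard": {"build_mart"},
}

# static_order() yields tasks so that every task appears only after all of
# its upstream dependencies; a cycle would raise CycleError instead.
order = list(TopologicalSorter(dag).static_order())
```

Real orchestrators add what this sketch omits — parallel execution of independent branches, retries, backfills, and monitoring — but the dependency-ordering contract is the same.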
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a Data Modelling Consultant with 6 to 9 years of experience to work in our Chennai office. As a Data Modelling Consultant, your role will involve providing end-to-end modeling support for OLTP and OLAP systems hosted on Google Cloud. Your responsibilities will include designing and validating conceptual, logical, and physical models for cloud databases, translating requirements into efficient schema designs, and supporting data model reviews, tuning, and implementation. You will also guide teams on best practices for schema evolution, indexing, and governance, enabling the use of these models in real-time applications and analytics platforms. To succeed in this role, you must have strong experience in modeling across OLTP and OLAP systems, hands-on experience with GCP tools like BigQuery, CloudSQL, and AlloyDB, and the ability to understand business rules and translate them into scalable structures. Additionally, familiarity with partitioning, sharding, materialized views, and query optimization is essential. Preferred skills for this role include experience with BFSI or financial-domain data schemas and familiarity with modeling methodologies and standards such as 3NF and star schema. Soft skills like excellent stakeholder communication, collaboration, strategic thinking, and attention to scalability are also important. Joining this role will allow you to deliver advisory value across critical data initiatives, influence the modeling direction for a data-driven organization, and be at the forefront of GCP-based enterprise data transformation.
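The OLTP-versus-OLAP distinction the role centres on shows up concretely in schema shape: transactional models normalise (3NF), while analytical models denormalise into a star schema of fact and dimension tables. A tiny star schema sketched with SQLite as a stand-in for a cloud warehouse — every table and column name is an illustrative assumption:

```python
import sqlite3

# A minimal star schema: one fact table keyed to two dimensions.
# SQLite stands in for BigQuery/CloudSQL; all names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, yr INTEGER, mon INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER REFERENCES dim_date,
                              product_key INTEGER REFERENCES dim_product,
                              amount REAL);
    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES (20240101, 1, 100.0), (20240101, 2, 50.0),
                                  (20240201, 1, 75.0);
""")

# Typical OLAP query: aggregate the fact table grouped by dimension attributes.
monthly = conn.execute("""
    SELECT d.mon, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.mon, p.name
    ORDER BY d.mon, p.name
""").fetchall()
```

In a warehouse like BigQuery the fact table would additionally be partitioned (typically by date) and clustered on common filter columns so queries like this scan only the relevant slices.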
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Cloud Data Architect specializing in BigQuery and CloudSQL at our Chennai office, you will play a crucial role in leading the design and implementation of scalable, secure, and high-performing data architectures using Google Cloud technologies. Your expertise will be essential in shaping architectural direction and ensuring that data solutions meet enterprise-grade standards. Your responsibilities will include designing data architectures that align with performance, cost-efficiency, and scalability needs, implementing data models, security controls, and access policies across GCP platforms, leading cloud database selection, schema design, and tuning for analytical and transactional workloads, collaborating with DevOps and DataOps teams to deploy and manage data environments, ensuring best practices for data governance, cataloging, and versioning, and enabling real-time and batch integrations using GCP-native tools. To excel in this role, you must possess deep knowledge of BigQuery, CloudSQL, and the GCP data ecosystem, along with strong experience in schema design, partitioning, clustering, and materialized views. Hands-on experience in implementing data encryption, IAM policies, and VPC configurations is crucial, as well as an understanding of hybrid and multi-cloud data architecture strategies and data lifecycle management. Proficiency in GCP cost optimization is also required. Preferred skills for this role include experience with AlloyDB, Firebase, or Spanner, familiarity with LookML, dbt, or DAG-based orchestration tools, and exposure to the BFSI domain or financial services architecture. In addition to technical skills, soft skills such as visionary thinking with practical implementation skills, strong communication, and cross-functional leadership are highly valued. Previous experience guiding data strategy in enterprise settings will be advantageous. 
Joining our team will give you the opportunity to own data architecture initiatives in a cloud-native ecosystem, drive innovation through scalable and secure GCP designs, and collaborate with forward-thinking data and engineering teams. Skills required for this role include BigQuery, CloudSQL, AlloyDB, Spanner, Firebase, schema design, data architecture, the GCP data ecosystem, IAM policies, VPC configurations, data encryption, data lifecycle management, partitioning, clustering, materialized views, dbt, DAG-based orchestration tools, LookML, and GCP cost optimization.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior T-SQL Developer with over 10 years of experience, you will be responsible for supporting mission-critical, transactional data systems for a global US-based bank. Your expertise in SQL Server performance optimization and database design will be crucial in maintaining scalable, secure, and compliant database environments in the banking and financial services industry. You will design and develop robust T-SQL objects such as stored procedures, functions, triggers, and complex queries, utilizing joins, CTEs, and window functions. Your role will involve analyzing and optimizing slow-running queries using execution plans, performance statistics, and indexing strategies. Participating in database schema design, applying normalization principles, defining relationships, and suggesting indexing for performance based on access patterns will also be part of your responsibilities. Your deep knowledge of transactions, locking, isolation levels, and deadlock resolution will be essential in implementing reliable and consistent transaction logic. You will utilize tools like SQL Server Management Studio (SSMS), DMVs, Activity Monitor, and Query Store for performance monitoring and troubleshooting. Additionally, you will conduct code reviews, enforce SQL coding standards, and collaborate with cross-functional teams to ensure the delivery of high-quality, secure SQL code. Your must-have skills include expert-level T-SQL programming, strong query optimization abilities, solid understanding of schema design and data modeling, and experience with SQL Server monitoring tools. Your proven track record of peer collaboration and delivering production-ready code in high-compliance environments, particularly in the banking/finance sector, will set you up for success in this role. 
While not mandatory, nice-to-have skills for this position include exposure to Azure SQL or Azure Synapse Analytics, familiarity with CI/CD for SQL, experience with SQL unit testing frameworks, exposure to monitoring tools like Redgate or SentryOne, basic knowledge of SSIS/SSRS/SSAS for ETL and reporting, and an understanding of data security and compliance practices such as SOX and GDPR. If you are a technically deep professional who thrives in a regulated, performance-sensitive ecosystem and can contribute independently to both development and optimization, this Senior T-SQL Developer role in Pune (Hybrid) may be the perfect fit for you.
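The CTEs and window functions this role leans on combine naturally in "top-N per group" queries, a staple of transactional reporting. A sketch using SQLite from Python as a stand-in for T-SQL — the `WITH`/`ROW_NUMBER() OVER (PARTITION BY ...)` syntax shown is shared by both engines, and the table and column names are illustrative:

```python
import sqlite3

# CTE + window function sketch, with SQLite standing in for SQL Server.
# The query pattern (ranked CTE, then filter on rn) is identical in T-SQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txn (account TEXT, amount REAL);
    INSERT INTO txn VALUES ('A', 100), ('A', 300), ('B', 200), ('B', 50);
""")

# Rank each account's transactions by amount, then keep the largest per account.
top_per_account = conn.execute("""
    WITH ranked AS (
        SELECT account, amount,
               ROW_NUMBER() OVER (PARTITION BY account
                                  ORDER BY amount DESC) AS rn
        FROM txn
    )
    SELECT account, amount FROM ranked WHERE rn = 1 ORDER BY account
""").fetchall()
```

On SQL Server, checking the execution plan for such a query (e.g. for a sort spilling to tempdb, or a missing index on the partition column) is exactly the kind of optimization work the posting describes.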
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
The ideal candidate for this role will be responsible for building the entire backend platform for a product portfolio and delivering new features end to end. You will be tasked with evolving the architecture to ensure performance and scalability while designing, developing, and owning components of a highly scalable, distributed web services platform. Your commitment to improving the software development process and team productivity will be key, as well as mentoring and training team members and leading module development independently. To be successful in this position, you should have a minimum of 5.5 years of experience in a scalable product/ecommerce organization, with excellent Java skills and a solid understanding of the Spring framework and MVC approach. A strong knowledge of performance optimization and caching techniques is essential, along with proficiency in Object-Oriented Programming concepts, data structures, and algorithms. Experience in developing scalable, fault-tolerant, distributed backend services, as well as familiarity with prevalent design patterns and advanced system design, will be advantageous. Additionally, expertise in databases and schema design, particularly with NoSQL databases, and strong problem-solving skills are required to excel in this role.
Posted 2 weeks ago