3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You will be responsible for designing, developing, and maintaining enterprise-grade search solutions using Apache Solr and SolrCloud. Your key tasks will include developing and optimizing search indexes and schemas for use cases such as product search, document search, or order/invoice search. Additionally, you will integrate Solr with backend systems, databases, and APIs, implementing features like full-text search, faceted search, auto-suggestions, ranking, and relevancy tuning. It will also be part of your role to optimize search performance, indexing throughput, and query response time.

Your expertise in Apache Solr and SolrCloud, along with a strong understanding of Lucene, inverted indexes, analyzers, tokenizers, and search relevance tuning, will be essential for this position. Proficiency in Java or Python for backend integration and development is required, as is experience with RESTful APIs, data pipelines, and real-time indexing. Familiarity with Zookeeper, Docker, and Kubernetes for SolrCloud deployments, and knowledge of JSON, XML, and schema design in Solr, will also be necessary.

Furthermore, you will ensure data consistency and high availability using SolrCloud and Zookeeper for cluster coordination and configuration management. You will be expected to monitor the health of the search system and troubleshoot any issues that arise in production. Collaboration with product teams, data engineers, and DevOps teams will be crucial for smooth delivery, as will staying up to date with new features of Apache Lucene/Solr and recommending improvements.

Preferred qualifications include a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Experience with Elasticsearch or other search technologies is advantageous, as is working knowledge of CI/CD pipelines and cloud platforms such as Azure. Overall, your role will involve building search solutions, optimizing performance, ensuring data consistency, and collaborating with cross-functional teams for successful project delivery.
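By way of illustration, here is a minimal sketch of the kind of faceted query this role works with, sent to Solr's standard select endpoint from Python via the requests library; the collection name "products" and the field names are hypothetical:

```python
import requests

# Query a hypothetical SolrCloud collection "products": full-text search
# with a price filter and per-category facet counts.
SOLR_URL = "http://localhost:8983/solr/products/select"

params = {
    "q": "name:laptop",              # full-text query
    "fq": "price:[300 TO 1500]",     # filter query: cached, not scored
    "facet": "true",                 # enable faceting
    "facet.field": "category",      # count matches per category value
    "rows": 10,
    "wt": "json",
}

resp = requests.get(SOLR_URL, params=params, timeout=10)
resp.raise_for_status()
data = resp.json()

for doc in data["response"]["docs"]:
    print(doc.get("id"), doc.get("name"))

# Facet output is a flat [value, count, value, count, ...] list.
print(data["facet_counts"]["facet_fields"]["category"])
```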
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
Join us as a Senior Developer at Barclays, where you will spearhead the evolution of the digital landscape, driving innovation and excellence. You will utilize cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. You will be assessed on the critical skills relevant for success in the role and on job-specific skillsets.

To be successful as a Senior Developer, you should have experience with:

Basic/Essential Qualifications:
- Graduate/Postgraduate with hands-on experience (8+ years) in Microsoft .NET, C#, ASP.NET MVC, Web Services, Web API, .NET Core, JavaScript, jQuery, HTML5, CSS3.
- Previous experience with RESTful services is a big plus.
- Good knowledge and understanding of pricing different products.
- Experience with relational database systems, schema design, SSIS, SQL (MSSQL), and stored procedures.
- Strong general development practices such as OOAD, design patterns, continuous integration, unit testing, and Agile processes.
- Structured approach to problem-solving and ability to manage parallel streams of work.
- Experience with technologies supporting development, continuous integration, automated testing, and deployment.
- Ability to mentor and guide junior team members.

Desirable skillsets/good to have:
- Knowledge and experience with OpenShift and other cloud-based solutions is a plus.
- UI framework expertise.

This role will be based out of Pune.

Purpose of the role: To design, develop, and improve software using various engineering methodologies that provide business, platform, and technology capabilities for customers and colleagues.

Accountabilities:
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools.
- Collaborating with product managers, designers, and engineers to define software requirements, devise solution strategies, and ensure seamless integration with business objectives.
- Participation in code reviews and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends, contributing to the organization's technology communities, and fostering a culture of technical excellence and growth.
- Adherence to secure coding practices and implementation of effective unit testing practices.

Expectations for Assistant Vice President:
- Advising and influencing decision-making, contributing to policy development, and ensuring operational effectiveness.
- Leading a team to deliver work impacting the whole business function.
- Setting objectives, coaching employees, and appraising performance relative to objectives.
- Demonstrating leadership behaviours to create an environment for colleagues to thrive and deliver to an excellent standard.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Tech Lead, Data Architecture at Fiserv, you will play a crucial role in our data warehousing strategy and implementation. Your responsibilities will include designing, developing, and leading the adoption of Snowflake-based solutions to ensure efficient and secure data systems that drive our business analytics and decision-making processes. Collaborating with cross-functional teams, you will define and implement best practices for data modeling, schema design, and query optimization in Snowflake. Additionally, you will develop and manage ETL/ELT workflows to ingest, transform, and load data into Snowflake from diverse sources such as databases, APIs, flat files, and cloud storage.

Monitoring and tuning Snowflake performance, you will manage caching, clustering, and partitioning to enhance efficiency while analyzing and resolving query performance bottlenecks. You will work closely with data analysts, data engineers, and business users to understand reporting and analytics needs, ensuring seamless integration with BI tools like Power BI. Your role will also involve collaborating with the DevOps team on automation, deployment, and monitoring, as well as planning and executing strategies for scaling Snowflake environments as data volume grows. Keeping up to date with emerging trends and technologies in data warehousing and data management is essential, along with providing technical support, troubleshooting, and guidance to users accessing the data warehouse.

To be successful in this role, you must have 8 to 10 years of experience with data management tools like Snowflake, StreamSets, and Informatica. Experience with monitoring tools like Dynatrace and Splunk, Kubernetes cluster management, and Linux OS is required. Additionally, familiarity with containerization technologies, cloud services, and CI/CD pipelines, as well as banking or financial services experience, would be advantageous.

Thank you for considering employment with Fiserv. To apply, please use your legal name, complete the step-by-step profile, and attach your resume. Fiserv is committed to diversity and inclusion and does not accept resume submissions from agencies outside of existing agreements. Beware of fraudulent job postings not affiliated with Fiserv, and protect your personal information and financial security.
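As a hedged illustration of the ELT work described above, a minimal Python sketch using the snowflake-connector-python package; the account credentials, stage, and table names are all placeholders:

```python
import snowflake.connector

# Connect to Snowflake (credentials and object names here are placeholders).
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Load newly arrived files from an external stage into a raw table,
    # then aggregate into a reporting table -- a minimal ELT pattern.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    cur.execute("""
        INSERT INTO REPORTING.DAILY_ORDER_TOTALS
        SELECT order_date, SUM(amount)
        FROM RAW.ORDERS
        GROUP BY order_date
    """)
finally:
    conn.close()
```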
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As a software developer in our team, you will play a crucial role in understanding customer needs by collaborating with product managers and business stakeholders. Your responsibilities will include designing, developing, delivering, and supporting large-scale, distributed software applications and tools in an agile, startup-like environment. You will be expected to prepare necessary documents such as flowcharts and workflows to identify requirements and solutions, and to take the initiative to invent new solutions for our customers.

Building the entire back-end platform for a product portfolio will be a key part of your role, along with designing and leading backend architecture implementation in an innovative environment. Ownership of features across the entire life cycle, from inception to deployment in production, will be essential. You will be encouraged to pick up new technologies and frameworks that best suit the needs of our products and users, while using software engineering best practices to ensure high standards of quality and maintainability for all deliverables.

Your technical competencies should include proficiency in Go (Golang), an understanding of Kubernetes and OCP, and familiarity with CI/CD pipelines, the Spring framework, the MVC approach, performance optimization, caching techniques, databases, schema design, object-oriented programming concepts, data structures, and algorithms. Your leadership competencies should encompass customer obsession, collaboration, influence, an ownership mindset, learning agility, navigating change, leaders building leaders, and execution excellence.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You have a total of 4-6 years of development/design experience, with a minimum of 3 years of experience in Big Data technologies on-prem and in the cloud. You should be proficient in Snowflake and possess strong SQL programming skills. The role requires strong experience with data modeling and schema design, as well as extensive experience with data warehousing tools like Snowflake/BigQuery/Redshift and BI tools like Tableau/QuickSight/Power BI (hands-on experience with at least one of each is a must). You must also have experience with orchestration tools like Airflow and the transformation tool dbt.

Your responsibilities will include implementing ETL/ELT processes and building data pipelines, along with workflow management, job scheduling, and monitoring. You should have a good understanding of data governance, security and compliance, data quality, metadata management, master data management, and data catalogs, as well as cloud services (AWS), including IAM and log analytics. Excellent interpersonal and teamwork skills are essential, along with experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required.

At GlobalLogic, the culture prioritizes caring and inclusivity. You'll join an environment where people come first, fostering meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. Meaningful work awaits you at GlobalLogic, where you'll have the chance to work on impactful projects and engage your curiosity and problem-solving skills. The organization values balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a balance between work and life. GlobalLogic is a high-trust organization where integrity is key, ensuring a safe, reliable, and ethical global environment for all employees. Truthfulness, candor, and integrity are fundamental values upheld in everything GlobalLogic does.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner that collaborates with the world's largest and most forward-thinking companies. Leading the digital revolution since 2000, GlobalLogic helps create innovative digital products and experiences, transforming businesses and redefining industries through intelligent products, platforms, and services.
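As a rough sketch of the orchestration work this posting mentions, a minimal Airflow DAG (assuming Airflow 2.4+; the DAG id, schedule, and task bodies are hypothetical placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw records from a source system (placeholder logic).
    print("extracting...")

def load():
    # Load transformed records into the warehouse (placeholder logic).
    print("loading...")

with DAG(
    dag_id="daily_orders_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load                 # extract must finish before load
```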
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are looking for a Data Modelling Consultant with 6 to 9 years of experience to work in our Chennai office. As a Data Modelling Consultant, your role will involve providing end-to-end modeling support for OLTP and OLAP systems hosted on Google Cloud. Your responsibilities will include designing and validating conceptual, logical, and physical models for cloud databases, translating requirements into efficient schema designs, and supporting data model reviews, tuning, and implementation. You will also guide teams on best practices for schema evolution, indexing, and governance to enable the use of models in real-time applications and analytics platforms.

To succeed in this role, you must have strong experience in modeling across OLTP and OLAP systems, hands-on experience with GCP tools like BigQuery, CloudSQL, and AlloyDB, and the ability to understand business rules and translate them into scalable structures. Additionally, familiarity with partitioning, sharding, materialized views, and query optimization is essential.

Preferred skills for this role include experience with BFSI or financial domain data schemas and familiarity with modeling methodologies and standards such as 3NF and star schema. Soft skills like excellent stakeholder communication, collaboration, strategic thinking, and attention to scalability are also important.

Joining this role will allow you to deliver advisory value across critical data initiatives, influence the modeling direction of a data-driven organization, and be at the forefront of GCP-based enterprise data transformation.
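To make the partitioning concepts above concrete, a small sketch using the google-cloud-bigquery client; the project, dataset, and column names are invented for illustration:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Create a date-partitioned, clustered fact table -- table and field
# names are hypothetical.
ddl = """
CREATE TABLE IF NOT EXISTS `my_project.sales.fct_transactions`
(
  txn_id STRING,
  customer_id STRING,
  txn_date DATE,
  amount NUMERIC
)
PARTITION BY txn_date          -- prune scans to the queried date range
CLUSTER BY customer_id         -- co-locate rows for selective filters
"""
client.query(ddl).result()  # wait for the DDL job to complete
```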
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Cloud Data Architect specializing in BigQuery and CloudSQL at our Chennai office, you will play a crucial role in leading the design and implementation of scalable, secure, and high-performing data architectures using Google Cloud technologies. Your expertise will be essential in shaping architectural direction and ensuring that data solutions meet enterprise-grade standards.

Your responsibilities will include designing data architectures that align with performance, cost-efficiency, and scalability needs; implementing data models, security controls, and access policies across GCP platforms; leading cloud database selection, schema design, and tuning for analytical and transactional workloads; collaborating with DevOps and DataOps teams to deploy and manage data environments; ensuring best practices for data governance, cataloging, and versioning; and enabling real-time and batch integrations using GCP-native tools.

To excel in this role, you must possess deep knowledge of BigQuery, CloudSQL, and the GCP data ecosystem, along with strong experience in schema design, partitioning, clustering, and materialized views. Hands-on experience implementing data encryption, IAM policies, and VPC configurations is crucial, as is an understanding of hybrid and multi-cloud data architecture strategies and data lifecycle management. Proficiency in GCP cost optimization is also required.

Preferred skills include experience with AlloyDB, Firebase, or Spanner; familiarity with LookML, dbt, or DAG-based orchestration tools; and exposure to the BFSI domain or financial services architecture. In addition to technical skills, soft skills such as visionary thinking with practical implementation ability, strong communication, and cross-functional leadership are highly valued. Previous experience guiding data strategy in enterprise settings will be advantageous.

Joining our team will give you the opportunity to own data architecture initiatives in a cloud-native ecosystem, drive innovation through scalable and secure GCP designs, and collaborate with forward-thinking data and engineering teams.

Skills required for this role include IAM policies, Spanner, cloud, schema design, data architecture, the GCP data ecosystem, dbt, GCP cost optimization, AlloyDB, data encryption, data lifecycle management, BigQuery, LookML, VPC configurations, partitioning, clustering, materialized views, DAG-based orchestration tools, Firebase, and CloudSQL.
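As a hedged illustration of the materialized-view work this posting mentions, a short sketch using the google-cloud-bigquery client; the dataset and table names are hypothetical and assume a fact table like the one above already exists:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Materialized view that pre-aggregates daily totals; BigQuery keeps it
# incrementally up to date and can transparently rewrite matching queries.
ddl = """
CREATE MATERIALIZED VIEW IF NOT EXISTS `my_project.sales.mv_daily_totals` AS
SELECT txn_date, SUM(amount) AS total_amount
FROM `my_project.sales.fct_transactions`
GROUP BY txn_date
"""
client.query(ddl).result()
```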
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior T-SQL Developer with over 10 years of experience, you will be responsible for supporting mission-critical, transactional data systems for a global US-based bank. Your expertise in SQL Server performance optimization and database design will be crucial in maintaining scalable, secure, and compliant database environments in the banking and financial services industry.

You will design and develop robust T-SQL objects such as stored procedures, functions, triggers, and complex queries, utilizing joins, CTEs, and window functions. Your role will involve analyzing and optimizing slow-running queries using execution plans, performance statistics, and indexing strategies. You will also participate in database schema design, applying normalization principles, defining relationships, and recommending indexes based on access patterns.

Your deep knowledge of transactions, locking, isolation levels, and deadlock resolution will be essential in implementing reliable and consistent transaction logic. You will use tools like SQL Server Management Studio (SSMS), DMVs, Activity Monitor, and Query Store for performance monitoring and troubleshooting. Additionally, you will conduct code reviews, enforce SQL coding standards, and collaborate with cross-functional teams to deliver high-quality, secure SQL code.

Must-have skills include expert-level T-SQL programming, strong query optimization abilities, a solid understanding of schema design and data modeling, and experience with SQL Server monitoring tools. A proven track record of peer collaboration and delivering production-ready code in high-compliance environments, particularly in the banking/finance sector, will set you up for success in this role.

While not mandatory, nice-to-have skills include exposure to Azure SQL or Azure Synapse Analytics, familiarity with CI/CD for SQL, experience with SQL unit testing frameworks, exposure to monitoring tools like Redgate or SentryOne, basic knowledge of SSIS/SSRS/SSAS for ETL and reporting, and an understanding of data security and compliance practices such as SOX and GDPR.

If you are a technically deep professional who thrives in a regulated, performance-sensitive ecosystem and can contribute independently to both development and optimization, this Senior T-SQL Developer role in Pune (hybrid) may be the perfect fit for you.
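For flavor, a small sketch of the set-based T-SQL this role centers on (a CTE plus a window function), executed from Python via pyodbc; the connection string, table, and column names are invented:

```python
import pyodbc

# Connection string values are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=Payments;Trusted_Connection=yes;"
)

# Latest transaction per account: a CTE ranks rows per account with
# ROW_NUMBER(), then the outer query keeps rank 1.
sql = """
WITH ranked AS (
    SELECT account_id,
           txn_id,
           amount,
           ROW_NUMBER() OVER (PARTITION BY account_id
                              ORDER BY txn_ts DESC) AS rn
    FROM dbo.Transactions
)
SELECT account_id, txn_id, amount
FROM ranked
WHERE rn = 1;
"""

for row in conn.cursor().execute(sql):
    print(row.account_id, row.txn_id, row.amount)
```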
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
Haryana
On-site
The ideal candidate for this role will be responsible for building the entire backend platform for a product portfolio and delivering new features end to end. You will be tasked with evolving the architecture to ensure performance and scalability while designing, developing, and owning components of a highly scalable, distributed web services platform. Your commitment to improving the software development process and team productivity will be key, as will mentoring and training team members and leading module development independently.

To be successful in this position, you should have a minimum of 5.5 years of experience in a scalable product/ecommerce organization, with excellent Java skills and a solid understanding of the Spring framework and the MVC approach. Strong knowledge of performance optimization and caching techniques is essential, along with proficiency in object-oriented programming concepts, data structures, and algorithms. Experience in developing scalable, fault-tolerant, distributed backend services, as well as familiarity with prevalent design patterns and advanced system design, will be advantageous. Additionally, expertise in databases and schema design, particularly with NoSQL databases, and strong problem-solving skills are required to excel in this role.
Posted 2 months ago
0.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer, your primary role will involve designing, building, and maintaining data pipelines and infrastructure to support data-driven initiatives. You will be responsible for ensuring that data is collected, stored, and processed efficiently to enable analysis and business use.

Your key responsibilities will include designing, implementing, and optimizing end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data. Additionally, you will build and maintain the data infrastructure that enables the organization to leverage data effectively. You will also play a crucial role in designing and maintaining data models, schemas, and database structures to support analytical and operational use cases. Ensuring data quality, accuracy, and security throughout the data lifecycle will be a key focus area.

Collaboration with data scientists, analysts, and other stakeholders will be essential to understand data requirements and deliver effective solutions. Problem-solving skills will be crucial as you identify and address data-related challenges to ensure that data is readily available for analysis and decision-making. To excel in this role, you will need to stay up to date with the latest data engineering technologies and tools, enabling you to apply the most effective solutions for pipeline design, infrastructure maintenance, and data modeling.

This is a full-time, permanent position suitable for fresher candidates. The work schedule covers day and morning shifts. Performance bonuses will be provided, and the work location is in person. Benefits include food provided during work hours, supporting your well-being at the workplace.
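As a toy illustration of the ingest-transform-load work described (the file, table, and column names are made up, and SQLite stands in for a real warehouse target):

```python
import sqlite3

import pandas as pd

# Read raw CSV, apply basic data-quality steps, load into a queryable store.
raw = pd.read_csv("raw_events.csv", parse_dates=["event_ts"])

# Drop exact duplicates and rows missing key fields.
clean = raw.drop_duplicates().dropna(subset=["event_id", "event_ts"])

# Append the cleaned batch to the target table.
with sqlite3.connect("analytics.db") as conn:
    clean.to_sql("events", conn, if_exists="append", index=False)
```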
Posted 2 months ago
2.0 - 6.0 years
0 Lacs
Kannur, Kerala
On-site
As a Senior PHP Laravel Developer at our company, you will be an essential member of our team, responsible for developing and maintaining our cutting-edge SAAS-based product. Your role will involve leveraging your expertise in the PHP Laravel framework to build scalable, secure, and efficient applications on AWS.

Your key responsibilities will include:
- Developing and maintaining our SAAS-based product using the PHP Laravel framework.
- Designing and implementing scalable and secure applications on AWS for optimal performance and reliability.
- Creating and managing automated testing procedures to ensure the quality and functionality of the application.
- Applying best practices in code development to maintain clean, efficient, and maintainable code.
- Collaborating with cross-functional teams to analyze requirements, design solutions, and troubleshoot any issues.
- Keeping up to date with emerging trends and technologies in development, AWS services, and security practices.

To be successful in this role, you should have:
- A Bachelor's degree in Computer Science, Information Technology, or a related field.
- At least 2 years of development experience with a focus on the Laravel framework.
- Hands-on experience in schema design, query optimization, and REST APIs.
- Profound knowledge of AWS services such as EC2, S3, RDS, Lambda, and Redis.
- Demonstrated experience in designing scalable and secure web applications.
- Expertise in automated testing frameworks.
- A strong understanding of web security practices and their implementation in Laravel applications.
- Proficiency in code versioning tools like Git.
- Excellent problem-solving skills and the ability to thrive in a fast-paced environment.
- Strong communication, teamwork, and documentation skills.

In return, we offer:
- A competitive salary and benefits package.
- A dynamic and supportive work environment.
- Opportunities for professional growth and development.
- The chance to work on innovative products with a talented team.

If you are passionate about developing high-quality, secure, and scalable applications and seek a challenging opportunity for professional growth, we encourage you to apply. Please send your resume and a cover letter to our HR department. We are an equal-opportunity employer that values diversity and does not discriminate based on disability status. This is a full-time position based in person.
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Company: Our beliefs are the foundation for how we conduct business every day. We live each day guided by our core values of Inclusion, Innovation, Collaboration, and Wellness. Together, our values ensure that we work as one global team with our customers at the center of everything we do, and they push us to take care of ourselves, each other, and our communities.

Job Description Summary:

What you need to know about the role: We are looking for a Business Systems Analyst passionate about delivering quality deliverables in a fast-paced environment with an undivided customer focus.

Meet our team: The Finance Technology team consists of a diverse group of talented, driven, hive-minded subject matter experts who relentlessly work toward enabling best-in-class solutions for our customers to transform current-state solutions. You will work with this team to set up finance solutions, explore avenues to automate, challenge the status quo, and simplify the current state through transformation.

Your day to day:
- Build scalable systems by leading discussions with the business, understanding the requirements from both Customer and Business, and delivering requirements to the engineering team to guide them in building a robust, scalable solution.
- Have hands-on technical experience to provide support across multiple platforms (GCP, Python, Hadoop, SAP, Teradata, Machine Learning).
- Establish a consistent project management framework and develop processes to deliver high-quality software in rapid iterations for business partners in multiple geographies.
- Participate in a team that designs, develops, troubleshoots, and debugs software programs for databases, applications, tools, etc.
- Balance production platform stability, feature delivery, and the reduction of technical debt across a broad landscape of technologies.

What do you need to bring:
- Consistently high standards; your passion for quality is inherent in everything you do.
- Experience with GCP BQ, SQL, and data flow.
- 4+ years of relevant experience.
- Data warehouses, data marts, distributed data platforms, and data lakes.
- Data modeling and schema design.
- Reporting/visualization with Looker, Tableau, or Power BI.
- Knowledge of statistical and machine learning models.
- Excellent structured thinking skills, with the ability to break down multi-dimensional problems.
- Ability to navigate ambiguity and work in a fast-moving environment with multiple stakeholders.

Who We Are: To learn more about our culture and community, visit https://about.pypl.com/who-we-are/default.aspx

Commitment to Diversity and Inclusion: For general requests for consideration of your skills, please join our Talent Community. We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates, so please don't hesitate to apply.

REQ ID: R0115599
Posted 2 months ago
3.0 - 5.0 years
10 - 12 Lacs
Bengaluru
Hybrid
Notice Period: Immediate

Key Responsibilities:
- Design and implement data models and schemas to support business intelligence and analytics.
- Perform data engineering tasks as needed to support analytical activities.
- Develop clear, concise, and insightful data visualizations and dashboards.
- Interpret complex datasets and communicate findings through compelling visualizations and storytelling.
- Work closely with stakeholders to understand data requirements and deliver actionable insights.
- Maintain documentation and ensure data quality and integrity.

Reporting Structure:
- Direct reporting to Senior Data Engineer
- Dotted-line reporting to CTO

Required Skills & Qualifications:
- Proficiency in Power BI, Microsoft Fabric, and Power Query.
- Experience designing and implementing data models and schemas.
- Familiarity with basic data engineering tasks.
- Advanced SQL skills for querying and analyzing data.
- Exceptional ability to translate complex data into clear, actionable insights.
- Strong ability to communicate complex data insights effectively to technical and non-technical audiences.

Preferred Skills:
- Experience with Python for data manipulation and analysis (see the sketch below).
- Experience in the finance, tax, or professional services industries.
- Familiarity with Salesforce data models and integrations.
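As a small, hedged example of the Python-for-analysis skill listed above, shaping a dataset for a dashboard; the source file and column names are invented:

```python
import pandas as pd

# Monthly revenue totals per segment -- a typical dashboard input.
df = pd.read_csv("revenue.csv", parse_dates=["invoice_date"])

summary = (
    df.assign(month=df["invoice_date"].dt.to_period("M"))
      .groupby(["month", "segment"], as_index=False)["amount"]
      .sum()
      .pivot(index="month", columns="segment", values="amount")
)
print(summary.head())
```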
Posted 2 months ago
4.0 - 9.0 years
9 - 15 Lacs
Pune
Remote
Job Summary: We are hiring a Senior Backend Developer to design, develop, and maintain scalable backend systems that power our innovative project management software. This role involves:
- Developing high-performance backend logic using Node.js, Express, and MongoDB
- Building secure and efficient APIs
- Collaborating with frontend, DevOps, and architecture teams
- Applying deep knowledge of databases, infrastructure, and system-level thinking

Job Responsibilities:
- Design and implement robust backend systems using JavaScript (Node.js)
- Work extensively with the Express framework and MongoDB for data operations
- Use MongoDB aggregation pipelines for complex querying (see the sketch below)
- Write clean, scalable, and testable code
- Collaborate with frontend and DevOps teams to ensure seamless integration
- Support deployment, debugging, and monitoring in production environments
- Maintain documentation and conduct code reviews

Qualifications:
- B.Tech/BE in Computer Science or a related field from a reputed institute
- Experience: 4+ years in JavaScript development, especially on the backend
- At least 3 years of experience with Node, Express, and MongoDB
- Knowledge of DBMS, MongoDB pipelines, and system-level architecture

Skills Required:
- Excellent analytical and problem-solving skills
- Strong command of backend design principles and security standards
- Familiarity with database indexing, optimization, and schema design
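A minimal aggregation-pipeline sketch, shown in Python via pymongo for brevity (the pipeline document is identical in shape in the Node.js driver); the database, collection, and field names are hypothetical:

```python
from pymongo import MongoClient

# Open task counts per project, most loaded project first.
client = MongoClient("mongodb://localhost:27017")
tasks = client["pm_tool"]["tasks"]

pipeline = [
    {"$match": {"status": {"$ne": "done"}}},           # open tasks only
    {"$group": {"_id": "$project_id", "open": {"$sum": 1}}},
    {"$sort": {"open": -1}},
    {"$limit": 10},
]

for doc in tasks.aggregate(pipeline):
    print(doc["_id"], doc["open"])
```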
Posted 2 months ago
1.0 - 4.0 years
3 - 4 Lacs
Bengaluru
Work from Office
About Teachmint: At Teachmint, we believe that education moves the world forward and deserves the best technology in this pursuit. We are a global classroom technology company empowering educators and institutions in over 50 countries. At the forefront of classroom innovation, Teachmint is transforming how education is delivered through its proprietary solutions: Teachmint X, an AI-powered digital board; EduAI, an intelligent AI companion that empowers educators and learners to become self-reliant; and our interactive whiteboard technology, designed to blend intelligence with usability and elevate every moment of classroom interaction.

We are redefining education infrastructure. Whether you're architecting backend systems, designing intuitive front-end experiences, improving deployment pipelines, driving business growth and brand visibility, or scaling user impact, your work here directly shapes the future of education. If you're excited by the idea of building smart, scalable, and meaningful solutions in education, come create with us. Learn more: www.teachmint.com

Role: Operations Associate
Team: Teachmint X OMC (Order Management Cycle)

Job Summary: Teachmint is looking for a highly motivated and curious individual to be part of the Analytics team and build the foundation of intelligence that drives the business forward. The ideal candidate will be passionate about leveraging data to drive strategic business decisions and improve operational efficiency. In this role, you will analyze data, create insightful dashboards, write SQL queries, and present actionable recommendations to key stakeholders. Additionally, you will contribute to the design of database schemas to support our growing platform.

Key responsibilities:
- Work closely with Business Managers/Heads to understand and solve business problems through data-driven decisions.
- Develop and maintain the backend data and queries used for dashboarding and visualizations.
- Present insights and recommendations to leadership using high-quality visualizations and concise messaging.
- Administrative tasks: handling paperwork, emails, phone calls, and scheduling appointments.
- Data entry and management: inputting data, maintaining records, and generating reports.
- Partner support: assisting partners with inquiries, resolving issues, and delivering excellent service.
- Order management: managing business tools and overseeing end-to-end order processing.

Preferred Experience:
- 1+ years of hands-on experience in the analytics domain.
- Strong experience with Google Sheets, Slides, and Forms.
- Proactive problem-solving.
- Excellent communication and presentation skills, with the ability to convey technical concepts to non-technical stakeholders.
- Prior experience in schema design and data modeling is a plus.

Must-have mindsets and skillsets:
- Ability to translate structured and unstructured problems into an analytical framework.
- Comfortable in a fast-paced start-up environment; learn on the job and get things done.
- Willingness to lead projects independently.
Posted 3 months ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
We are seeking a highly skilled Senior Snowflake Developer with expertise in Python, SQL, and ETL tools to join our dynamic team. The ideal candidate will have a proven track record of designing and implementing robust data solutions on the Snowflake platform, along with strong programming skills and experience with ETL processes.

Key Responsibilities:
- Designing and developing scalable data solutions on the Snowflake platform to support business needs and analytics requirements.
- Leading the end-to-end development lifecycle of data pipelines, including data ingestion, transformation, and loading processes.
- Writing efficient SQL queries and stored procedures to perform complex data manipulations and transformations within Snowflake.
- Implementing automation scripts and tools using Python to streamline data workflows and improve efficiency (see the sketch below).
- Collaborating with cross-functional teams to gather requirements, design data models, and deliver high-quality solutions.
- Performance tuning and optimization of Snowflake databases and queries to ensure optimal performance and scalability.
- Implementing best practices for data governance, security, and compliance within Snowflake environments.
- Mentoring junior team members and providing technical guidance and support as needed.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience working with the Snowflake data warehouse.
- Strong proficiency in SQL, with the ability to write complex queries and optimize performance.
- Extensive experience developing data pipelines and ETL processes using Python and ETL tools such as Apache Airflow, Informatica, or Talend.
- Strong Python coding experience (minimum 2 years).
- Solid understanding of data warehousing concepts, data modeling, and schema design.
- Experience working with cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving and analytical skills with keen attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Any relevant certifications in Snowflake or related technologies would be a plus.
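A hedged sketch of the kind of Python automation script described above, using snowflake-connector-python's %s parameter binding; the credentials and table names are placeholders:

```python
import snowflake.connector

# Nightly refresh sketch: idempotently rebuild one day's aggregates.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
)
try:
    cur = conn.cursor()
    run_date = "2024-06-01"  # would normally come from a scheduler
    # Delete-then-insert keeps the load idempotent if the job reruns.
    cur.execute(
        "DELETE FROM MARTS.DAILY_REVENUE WHERE revenue_date = %s",
        (run_date,),
    )
    cur.execute(
        """
        INSERT INTO MARTS.DAILY_REVENUE (revenue_date, total_amount)
        SELECT order_date, SUM(amount)
        FROM RAW.ORDERS
        WHERE order_date = %s
        GROUP BY order_date
        """,
        (run_date,),
    )
finally:
    conn.close()
```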
Posted 3 months ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad
Work from Office
What you will do: In this vital role, as a MuleSoft developer, you will be a member of the MyAmgen Product Team, responsible for crafting efficient, scalable, and high-quality integration solutions by applying MuleSoft development standard processes. The role involves designing and implementing modular solutions that deliver incremental business value, optimizing existing code, and working closely with the development lead and architect to ensure alignment with the architecture runway and strategic goals. The developer collaborates with cross-functional teams to ensure solutions meet business requirements, adhere to established development guidelines, and maintain consistency throughout the lifecycle. Additionally, this role requires performing tasks within Agile sprint cycles, meeting definitions of done, and delivering planned value. The developer is also responsible for testing, validating, and refining both individual and team contributions to ensure the quality and functionality of delivered solutions. This position calls for a proactive problem solver with deep technical expertise, a collaborative attitude, and a focus on delivering impactful results.

Roles & Responsibilities:
- Apply MuleSoft development standard processes to build efficient, scalable, and high-quality integration solutions.
- Follow development guidelines and standards established by the dev lead, product team, and platform teams to ensure consistency, compliance, and alignment.
- Design and implement modular solutions that deliver incremental business value while adhering to overall project goals and architectural principles.
- Improve and optimize existing code, collaborating with the dev lead and architect to align solutions with the architecture runway and strategic objectives.
- Ensure solutions meet business requirements by communicating and collaborating effectively with team members throughout the development process.
- Implement tasks aligned with definitions of done and deliver planned value during each sprint, adhering to Agile methodologies.
- Test, validate, and refine individual and team contributions to maintain the quality and functionality of integration solutions.
- Create and update documentation accurately describing software functionality.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. We seek a professional with these qualifications.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Functional Skills:

Must-Have Skills:
- Proficiency in MuleSoft development standard processes to create efficient, scalable, and high-quality integration solutions.
- Experience using Java programming within MuleSoft.
- Experience integrating MuleSoft solutions with external platforms and data sources such as Salesforce, AWS, or Azure.
- Experience using source control systems such as Git to manage MuleSoft code and configuration.

Good-to-Have Skills:
- Experience in testing, validating, and refining deliverables to maintain the quality and functionality of integration solutions.
- Expertise in designing and implementing modular integration solutions that deliver incremental business value and align with overall project goals and architectural principles.
- Experience optimizing and improving existing code, collaborating with the dev lead and architect to align with the architecture runway and strategic objectives.
- Understanding of data models, schema design, and relationships for efficient application functionality in integration solutions.
- Proficiency in modern software engineering practices, such as infrastructure-as-code, automated testing, and modular design principles.
- Skills in analyzing data and creating visualizations to monitor execution metrics and integration performance.
- Knowledge of Agile methodologies, including the ability to execute tasks aligned with definitions of done and deliver planned value in each sprint.

Professional Certifications: MuleSoft Certified Developer

Soft Skills:
- Strong critical-thinking and problem-solving skills.
- Ability to communicate and collaborate effectively.
- Proven awareness of how to function in a team setting.
Posted 3 months ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Delhi / NCR
Hybrid
- 8+ years of experience in data engineering or a related field.
- Strong expertise in Snowflake, including schema design, performance tuning, and security.
- Proficiency in Python for data manipulation and automation.
- Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.).
- Experience with DBT for data transformation and documentation.
- Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect).
- Strong SQL skills and experience with large-scale data sets.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
Posted 3 months ago
3.0 - 6.0 years
12 - 15 Lacs
Bengaluru
Work from Office
Job Summary: We seek a skilled Backend Developer with hands-on experience in Java (Spring Boot), Golang, and microservices architecture. The ideal candidate will be responsible for developing scalable backend microservices, integrating with frontend applications, managing APIs, and contributing to system performance and deployment processes.

Key Responsibilities:
- Design and develop scalable microservices using Java with Spring Boot.
- Develop and maintain backend services using Golang where appropriate.
- Integrate backend services with a Vue.js-based dashboard via RESTful APIs.
- Design, implement, and document new APIs, and consume existing APIs to support frontend and service-layer integration.
- Manage service routing and security via API gateways such as NGINX or Kong.
- Synchronize client-server interactions and optimize communication patterns.
- Handle database management, including schema design, optimization, and query performance.
- Create Docker images for services and contribute to container orchestration workflows.
- Optimize backend performance, conduct benchmarking, and perform rigorous testing to ensure reliability.
- Collaborate with DevOps on CI/CD pipelines and deployment automation.
- Work extensively on Linux-based systems and environments for development and production.

Required Skills and Qualifications:
- 3+ years of professional experience in backend development.
- Strong proficiency in Java (Spring Boot) and experience with Golang.
- Solid understanding of microservices architecture and design patterns.
- Experience in building and consuming RESTful APIs.
- Knowledge of frontend-backend integration using Vue.js or similar frameworks.
- Hands-on experience with NGINX or Kong as an API gateway.
- Experience with Docker, image creation, and basic container orchestration.
- Proficiency in working with relational or NoSQL databases.
- Good understanding of performance optimization and testing best practices.
- Strong working knowledge of Linux environments (Ubuntu, CentOS, Alpine, etc.).
- Excellent problem-solving skills and the ability to work collaboratively in a team.
Posted 3 months ago
7 - 9 years
10 - 15 Lacs
Mumbai
Work from Office
We are seeking a highly skilled Senior Snowflake Developer with expertise in Python, SQL, and ETL tools to join our dynamic team. The ideal candidate will have a proven track record of designing and implementing robust data solutions on the Snowflake platform, along with strong programming skills and experience with ETL processes.

Key Responsibilities:
- Designing and developing scalable data solutions on the Snowflake platform to support business needs and analytics requirements.
- Leading the end-to-end development lifecycle of data pipelines, including data ingestion, transformation, and loading processes.
- Writing efficient SQL queries and stored procedures to perform complex data manipulations and transformations within Snowflake.
- Implementing automation scripts and tools using Python to streamline data workflows and improve efficiency.
- Collaborating with cross-functional teams to gather requirements, design data models, and deliver high-quality solutions.
- Performance tuning and optimization of Snowflake databases and queries to ensure optimal performance and scalability.
- Implementing best practices for data governance, security, and compliance within Snowflake environments.
- Mentoring junior team members and providing technical guidance and support as needed.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience working with the Snowflake data warehouse.
- Strong proficiency in SQL, with the ability to write complex queries and optimize performance.
- Extensive experience developing data pipelines and ETL processes using Python and ETL tools such as Apache Airflow, Informatica, or Talend.
- Strong Python coding experience (minimum 2 years).
- Solid understanding of data warehousing concepts, data modeling, and schema design.
- Experience working with cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving and analytical skills with keen attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Any relevant certifications in Snowflake or related technologies would be a plus.
Posted 4 months ago
8.0 - 10.0 years
6 - 10 Lacs
Indore, Hyderabad, Ahmedabad
Hybrid
Role: Senior Data Engineer
Job Location: Hyderabad, Indore, Ahmedabad (India)
Notice Period: Immediate joiners or those within 15 days preferred
Share Your Resume With: current CTC, expected CTC, notice period, and preferred job location

Primary Skills:
- MSSQL, Redshift, Snowflake
- T-SQL, LinkSQL, stored procedures
- ETL pipeline development
- Query optimization & indexing
- Schema design & partitioning
- Data quality, SLAs, data refresh
- Source control (Git/Bitbucket), CI/CD
- Data modeling, versioning
- Performance tuning & troubleshooting

What You Will Do:
- Design scalable, partitioned schemas for MSSQL, Redshift, and Snowflake.
- Optimize complex queries, stored procedures, indexing, and performance tuning.
- Build and maintain robust data pipelines to ensure timely, reliable delivery of data.
- Own SLAs for data refreshes, ensuring reliability and consistency.
- Collaborate with engineers, analysts, and DevOps to align data models with product and business needs.
- Troubleshoot performance issues, implement proactive monitoring, and improve workflows.
- Enforce best practices for data security, governance, and compliance.
- Utilize schema migration/versioning tools for database changes.

What You'll Bring:
- Bachelor's or Master's in Computer Science, Engineering, or a related field.
- 8+ years of experience in database engineering or backend data systems.
- Expertise in MySQL, Redshift, Snowflake, and schema optimization.
- Strong experience writing functions, procedures, and robust SQL scripts.
- Proficiency with ETL processes, data modeling, and data-freshness SLAs.
- Experience handling production performance issues and being the go-to database expert.
- Hands-on with Git, CI/CD pipelines, and data observability tools.
- Strong problem-solving, collaboration, and analytical skills.

If you're interested and meet the above criteria, please share your resume with your current CTC, expected CTC, notice period, and preferred job location. Immediate or 15-day joiners will be prioritized.
Posted Date not available
8.0 - 10.0 years
9 - 13 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Key Responsibilities:
- Design and develop ETL/ELT workflows using Oracle Data Integrator (ODI).
- Build stored procedures and data warehouse schemas (fact and dimension tables).
- Collaborate with DBAs on SQL script execution and schema deployment.
- Develop Python scripts to automate API-based data ingestion (see the sketch below).
- Analyze data sources and align models with the BRD.
- Implement data validation, profiling, and data quality monitoring practices.
- Apply best practices for metadata management, query optimization, and performance tuning.
- Work closely with stakeholders to communicate technical solutions effectively.

Required Skills:
- 5+ years of hands-on experience with ODI.
- Strong in SQL, data modeling, and ETL/ELT concepts.
- Experience with data warehousing and schema design.
- Proficiency in Python for API integrations.
- Excellent communication, problem-solving, and analytical abilities.

How to Apply: Send your updated resume to [Insert Email]. Please include:
- Current CTC
- Expected CTC
- Notice Period
- Current Location
- Confirmation of night shift & remote work preference

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
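A minimal, hedged sketch of the API-ingestion automation mentioned above: page through a hypothetical REST endpoint and land the records as a CSV staging file for the ETL tool to pick up (the URL, pagination scheme, and file name are invented):

```python
import csv

import requests

BASE_URL = "https://api.example.com/v1/invoices"

rows, page = [], 1
while True:
    resp = requests.get(BASE_URL, params={"page": page}, timeout=30)
    resp.raise_for_status()
    batch = resp.json().get("results", [])
    if not batch:          # an empty page marks the end of the data set
        break
    rows.extend(batch)
    page += 1

if rows:
    # Write all records with a header row derived from the first record.
    with open("invoices_staging.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```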
Posted Date not available
8.0 - 12.0 years
30 - 35 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
Primary Skills:
- MSSQL, Redshift, Snowflake
- T-SQL, LinkSQL, stored procedures
- ETL pipeline development
- Query optimization & indexing
- Schema design & partitioning
- Data quality, SLAs, data refresh
- Source control (Git/Bitbucket), CI/CD
- Data modeling, versioning
- Performance tuning & troubleshooting

What You Will Do:
- Design scalable, partitioned schemas for MSSQL, Redshift, and Snowflake.
- Optimize complex queries, stored procedures, indexing, and performance tuning.
- Build and maintain robust data pipelines to ensure timely, reliable delivery of data.
- Own SLAs for data refreshes, ensuring reliability and consistency.
- Collaborate with engineers, analysts, and DevOps to align data models with product and business needs.
- Troubleshoot performance issues, implement proactive monitoring, and improve workflows.
- Enforce best practices for data security, governance, and compliance.
- Utilize schema migration/versioning tools for database changes.

What You'll Bring:
- Bachelor's or Master's in Computer Science, Engineering, or a related field.
- 8+ years of experience in database engineering or backend data systems.
- Expertise in MySQL, Redshift, Snowflake, and schema optimization.
- Strong experience writing functions, procedures, and robust SQL scripts.
- Proficiency with ETL processes, data modeling, and data-freshness SLAs.
- Experience handling production performance issues and being the go-to database expert.
- Hands-on with Git, CI/CD pipelines, and data observability tools.
- Strong problem-solving, collaboration, and analytical skills.
Posted Date not available