
8167 Query Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

India

Remote


Future-Able is looking for a Data Engineer, a full-time contract role, to work for Naked & Thriving, an organic, botanical skincare brand committed to creating high-performing, naturally derived products that are as kind to the planet as they are to your skin. Our mission is to empower individuals to embrace sustainable self-care while nurturing their natural beauty. Every product we craft reflects our dedication to sustainability, quality, and transparency, ensuring our customers feel confident with every choice they make. As we rapidly grow and expand into new categories, channels, and countries, customer satisfaction remains our top priority.

Job Summary: We are seeking a Data Engineer with expertise in Python, exposure to AI & Machine Learning, and a strong understanding of eCommerce analytics to design, develop, and optimize data pipelines. The ideal candidate will work on Google Cloud infrastructure, enabling advanced insights using Google Analytics (GA4).

What You Will Do:
● Develop and maintain scalable data pipelines to support analytics and AI-driven models.
● Work with Python (or an equivalent programming language) for data processing and transformation.
● Implement AI & Machine Learning techniques for predictive analytics and automation.
● Optimize eCommerce data insights using GA4 and Google Analytics to drive business decisions.
● Build cloud-based data infrastructure leveraging Google Cloud services such as BigQuery, Pub/Sub, and Dataflow.
● Ensure data integrity and governance across structured and unstructured datasets.
● Collaborate with cross-functional teams, including product managers, analysts, and marketing professionals.
● Monitor and troubleshoot data pipelines to ensure smooth operation and performance.

We are looking for:
● Proficiency in Python or a similar language (e.g., Scala).
● Experience with eCommerce analytics and tracking frameworks.
● Expertise in Google Analytics & GA4 for data-driven insights.
● Knowledge of Google Cloud Platform (GCP), including BigQuery, Cloud Functions, and Dataflow.
● Experience in designing, building, and optimizing data pipelines using ETL frameworks.
● Familiarity with data warehousing concepts and SQL-based query optimization.
● Strong problem-solving and communication skills in a fast-paced environment.

What will make you stand out:
● Experience with event-driven architecture for real-time data processing.
● Understanding of marketing analytics and attribution modeling.
● Previous work in a high-growth eCommerce environment.
● Exposure to AI & Machine Learning concepts and model deployment.

Benefits:
● USD salary.
● Fully remote work.
● USD 50 for health insurance payment.
● 30 days of paid time off per year.
● The possibility of being selected for annual bonuses based on business performance and personal achievements.
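For context, a minimal sketch (not part of the posting) of the kind of GA4-on-BigQuery pipeline query this role describes: aggregating daily purchase revenue from a GA4 events export. The project, dataset, and date range are hypothetical placeholders.

```python
# Minimal sketch: query a GA4 BigQuery export for daily purchase revenue.
# The project/dataset/table names below are hypothetical placeholders.
from google.cloud import bigquery

def daily_purchase_revenue(client: bigquery.Client) -> list[dict]:
    """Aggregate purchase counts and revenue per day from a GA4 export."""
    sql = """
        SELECT
          event_date,
          COUNT(*) AS purchases,
          SUM(ecommerce.purchase_revenue) AS revenue
        FROM `my-project.analytics_123456.events_*`  -- hypothetical GA4 export
        WHERE event_name = 'purchase'
          AND _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
        GROUP BY event_date
        ORDER BY event_date
    """
    return [dict(row) for row in client.query(sql).result()]

if __name__ == "__main__":
    client = bigquery.Client()  # uses default GCP credentials
    for row in daily_purchase_revenue(client):
        print(row)
```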

Posted 18 hours ago

Apply

5.0 years

0 Lacs

Delhi, India

On-site


Title of the Position: Senior Associate (IT) (On Contract)
No. of Positions: 02 (UR) (01 position for PHP (Laravel) profile and 01 position for Power BI profile).
Qualification: BE/B.Tech (Computer Science Engineering/Information Technology)/M.Tech/MCA or equivalent from a recognized university.

A. Senior Associate (IT), 01 position for PHP (Laravel) profile
Experience Required: At least 5 years of post-qualification experience in building and maintaining robust web applications using PHP and the Laravel framework. The candidate should have experience with critical applications, ensuring the design and implementation of scalable, secure, and high-performing applications. The following skills are desired:
- Strong proficiency in PHP and the Laravel framework.
- Well versed in RESTful API development and integration.
- Excellent understanding of HTML, CSS, JavaScript, and jQuery.
- Proven experience with Oracle database management.
- Familiarity with Node.js, JSON, and GitHub.
- Knowledge of token-based authentication and data security implementation.
- Hands-on experience with Apache, Linux, and Docker.
- Practical experience in Oracle Cloud Services implementation.

Preferred Skills:
- Attention to detail and the ability to write clean, maintainable code.
- Strong problem-solving and troubleshooting skills.
- Ability to work independently and collaboratively within cross-functional teams.
- Experience in the ESG domain and knowledge of the Postgres database and Microsoft Power BI is advantageous.
- Experience with CI/CD pipelines is preferred.

Key Objectives and Responsibilities:
- Develop and maintain web applications using Laravel and PHP.
- Build and integrate RESTful APIs to support application functionality.
- Collaborate with frontend developers to implement responsive UI components using HTML, CSS, JavaScript, and jQuery.
- Manage and optimize Oracle databases for performance and reliability.
- Integrate third-party APIs and manage secure data exchanges.
- Implement token-based authentication and authorization mechanisms.
- Apply data security best practices using Apache server configurations.
- Utilize GitHub for version control and collaborative development.
- Work with JSON for data serialization and system integration.
- Contribute to containerized application development using Docker.
- Deploy and maintain applications in Oracle Cloud Infrastructure (OCI).
- Work in Linux environments for development and deployment tasks.

B. Senior Associate (IT), 01 position for Power BI profile
Experience Required: At least 5 years of post-qualification experience in designing, developing, and optimizing data visualizations and business intelligence solutions using Microsoft Power BI. The following skills are desired:
- Expertise in DAX and Power Query for efficient data modelling and calculations, and integration with various data sources to deliver actionable insights.
- Ability to optimize Power BI performance for large datasets and enterprise-scale solutions.

Preferred Skills:
- Strong analytical and problem-solving skills to interpret complex data sets.
- Excellent communication and collaboration abilities to work with stakeholders and cross-functional teams.
- Experience in data governance and security to ensure compliance with best practices.
- Adaptability to evolving business requirements and emerging technologies.
- Mentorship skills to guide junior team members in Power BI development.
- Experience in the PHP (Laravel) framework is advantageous.
- Experience with the Postgres database and CI/CD implementation is a plus.
- Practical experience in Oracle Cloud Services implementation is a plus.

Key Objectives and Responsibilities:
- Develop and maintain interactive dashboards and reports using Power BI.
- Design and implement data models, ensuring accuracy and efficiency.
- Optimize DAX queries for performance and scalability.
- Integrate Power BI with multiple data sources, including SQL Server and cloud-based solutions.
- Ensure data governance and security best practices are followed.
- Collaborate with teams to translate business needs into visual analytics.
- Provide training and support to users on Power BI functionality.
- Continuously enhance Power BI solutions to improve decision-making processes.
- Deploy and maintain applications in Oracle Cloud Infrastructure (OCI).
- Proficiency in Oracle database and data integration to connect multiple sources effectively.
- Develop and optimize Oracle and Postgres database scripts.

HOW TO APPLY: Candidates fulfilling the above eligibility criteria may submit their resume/biodata by email to contract@ifciltd.com. Please write the title of the position in the subject of the e-mail. Kindly attach self-attested photocopies of the following documents in the email:
- Proof of date of birth
- Educational certificates
- Relevant experience certificates (containing areas and period of service)

Note: THE LAST DATE FOR SUBMISSION BY E-MAIL IS JUNE 26, 2025.

Posted 18 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description
Role Overview: Develop efficient SQL queries and maintain views, models, and data structures across federated and transactional databases to support analytics and reporting.

Core skills:
- SQL (advanced)
- Python – for data exploration and scripting
- Shell scripting – for lightweight automation

Key Responsibilities:
- Write complex SQL queries for data extraction and transformations.
- Build and maintain views, materialized views, and data models.
- Enable efficient federated queries and optimize joins across databases.
- Support performance tuning, indexing, and query optimization efforts.

Primary:
- Expertise in MS SQL Server / Oracle DB / PostgreSQL, columnar DBs such as DuckDB, and federated data access.
- Good understanding of the Apache Arrow columnar data format, Flight SQL, and Apache Calcite.

Secondary:
- Experience with data modelling, ER diagrams, and schema design.
- Familiarity with reporting layer backends (e.g., Power BI datasets).
- Familiarity with utility operations and power distribution is preferred.
- Experience with cloud-hosted databases is preferred.
- Exposure to data lakes in cloud ecosystems is a plus.

Optional:
- Familiarity with Grid CIM (Common Information Model; IEC 61970, IEC 61968).
- Familiarity with GE ADMS DNOM (Distribution Network Object Model) and GE GridOS Data Fabric.
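As an illustration of the columnar/federated querying this posting mentions, here is a minimal DuckDB sketch (an editorial assumption, not from the employer): it joins a Parquet file against an in-memory pandas DataFrame in a single SQL statement. The table and column names are made up.

```python
# Minimal sketch: DuckDB querying across a Parquet file and a pandas DataFrame.
import duckdb
import pandas as pd

con = duckdb.connect()  # in-memory database

# Persist a small hypothetical readings table as Parquet via DuckDB itself.
con.execute("""
    COPY (
        SELECT * FROM (VALUES
            (1, 10.5), (1, 12.0), (2, 8.25), (3, 30.0)
        ) AS t(meter_id, kwh)
    ) TO 'readings.parquet' (FORMAT PARQUET)
""")

meters = pd.DataFrame({
    "meter_id": [1, 2, 3],
    "feeder": ["F-101", "F-101", "F-202"],
})

# DuckDB resolves 'meters' from the local Python scope automatically,
# so one SQL statement spans the file and the in-memory frame.
result = con.execute("""
    SELECT m.feeder,
           AVG(r.kwh) AS avg_kwh
    FROM 'readings.parquet' AS r
    JOIN meters AS m USING (meter_id)
    GROUP BY m.feeder
    ORDER BY avg_kwh DESC
""").fetchdf()
print(result)
```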

Posted 18 hours ago

Apply

4.0 years

7 - 10 Lacs

Jaipur

On-site

We are seeking a highly skilled PostgreSQL Database Administrator (DBA) to join our team. The ideal candidate will have at least 4 years of hands-on experience managing PostgreSQL databases in production environments. You will be responsible for the installation, configuration, monitoring, performance tuning, backup, and security of PostgreSQL database systems to ensure high availability, reliability, and optimal performance.

Key Responsibilities:
· Database Administration: Install, configure, and maintain PostgreSQL database servers in development, testing, and production environments.
· Performance Tuning & Optimization: Monitor database performance, identify bottlenecks, and implement tuning strategies to optimize performance.
· Backup & Recovery: Implement and manage robust backup and disaster recovery solutions, including point-in-time recovery (PITR).
· High Availability & Replication: Set up and manage replication, failover, and high-availability solutions such as streaming replication, Patroni, etc.
· Security & Compliance: Ensure database security through access controls, encryption, auditing, and compliance with data protection regulations.
· Monitoring & Alerts: Implement monitoring tools and alerting mechanisms to proactively manage database health and availability.
· Upgrades & Patching: Plan and execute version upgrades, patching, and maintenance with minimal downtime.
· Troubleshooting & Support: Provide expert-level support and troubleshoot database-related issues across environments.
· Documentation: Maintain comprehensive documentation for configurations, procedures, and policies.
· Collaboration: Work closely with DevOps, developers, and IT teams to support application requirements and deployment needs.

Required Skills & Qualifications:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience:
· 3+ years of experience in PostgreSQL database administration.
· Strong hands-on experience with performance tuning, query analysis, and PostgreSQL internals.
Technical Skills:
· Expertise in PostgreSQL installation, configuration, and upgrade procedures.
· Proficiency in implementing replication, backup, and disaster recovery strategies.
· Strong knowledge of PostgreSQL security, authentication, and access control.
· Familiarity with Linux/Unix environments and shell scripting.
· Experience with monitoring tools such as pgAdmin, pg_stat_statements, and Prometheus + Grafana.
Preferred Skills (Nice to Have):
· Experience with cloud-hosted PostgreSQL (AWS RDS, Azure Database for PostgreSQL, etc.).
· Familiarity with automation tools like Ansible or Terraform.
· Knowledge of containerization and orchestration (e.g., Docker, Kubernetes).
Soft Skills:
· Excellent problem-solving and troubleshooting abilities.
· Strong documentation and communication skills.
· Ability to work independently and collaboratively in a fast-paced environment.
· Proactive and detail-oriented approach to database management.

Job Type: Full-time
Pay: ₹700,000.00 - ₹1,000,000.00 per year
Schedule: Day shift
Work Location: In person
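For illustration only, a minimal sketch of the performance-monitoring work described above: pulling the slowest statements from pg_stat_statements with psycopg2. The DSN is a placeholder, the pg_stat_statements extension must already be enabled on the server, and the column names assume PostgreSQL 13+ (earlier versions use total_time / mean_time).

```python
# Minimal sketch: report the top statements by total execution time.
import psycopg2

DSN = "host=localhost dbname=appdb user=postgres"  # placeholder connection string

QUERY = """
    SELECT query,
           calls,
           round(total_exec_time::numeric, 2) AS total_ms,
           round(mean_exec_time::numeric, 2)  AS mean_ms
    FROM pg_stat_statements
    ORDER BY total_exec_time DESC
    LIMIT 10;
"""

with psycopg2.connect(DSN) as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY)
        for query, calls, total_ms, mean_ms in cur.fetchall():
            # Truncate long SQL text so the report stays readable.
            print(f"{total_ms:>12} ms  {calls:>8} calls  {query[:60]}")
```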

Posted 18 hours ago

Apply

4.0 years

0 Lacs

Bahraich, Uttar Pradesh, India

On-site


Job Requirements
Role/Job Title: Branch Operations and Service Manager
Function/Department: Rural Banking

Job Purpose
The role bearer has to focus on helping the organization enable customers, partners, and other stakeholders to address their needs through proactive query resolution. It entails the responsibility of providing and setting up customer service quality procedures and standards for the team, and deploying strategies and best practices to achieve them. The role bearer also has to drive employee morale and engagement levels so that the organization is able to provide best-in-class service to its customers, increasing customer satisfaction, loyalty, and retention and contributing to the larger organizational objectives of the bank.

Roles & Responsibilities:
- Manage a team of customer service managers in charge of the inbound channel and correspondence branches.
- Provide excellent customer service and promote customer centricity in the organization by improving the customer service experience, engaging customers, and facilitating organic growth.
- Take ownership of customer issues and ensure their proactive resolution.
- Set a clear mission of enhancing service quality and deploy strategies focused on that mission by keeping ahead of the industry's developments and applying best practices to areas of improvement.
- Develop service procedures, policies, and standards.
- Analyse MIS, enhance productivity, and maintain accurate records, documenting customer service actions and discussions.
- Recruit, mentor, and develop customer service resources and nurture an environment where they can excel through encouragement and empowerment.
- Adhere to and manage the approved budget.
- Maintain an orderly workflow according to priorities.
- Regulate resources and utilize assets to achieve qualitative and quantitative targets.
- Enhance service quality and the level of customer focus in the organization.
- Leverage in-house synergies through collaboration with internal stakeholders.

Education Qualification
Graduation: Bachelor's in Engineering / Technology / Maths / Commerce / Arts / Science / Biology / Business / Computers / Management.
Experience: 4+ years' experience in customer service.

Posted 18 hours ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


DevOps Engineer (3-5 Years)
Location: Lower Parel, Mumbai

Expectations:
- Building and setting up new development tools and infrastructure.
- Understanding the needs of stakeholders and conveying them to developers.
- Working on ways to automate and improve development and release processes.

Experience required: 3-5+ years of professional experience.

Responsibilities:
- Building and setting up new development tools and infrastructure.
- Strong knowledge of AWS.
- Strong Linux and Windows system administration background.
- Understanding the needs of stakeholders and conveying them to developers.
- Working on ways to automate and improve development and release processes.
- Improve CI/CD tooling.
- Implement, maintain, and improve monitoring and alerting.
- Build and maintain highly available systems.
- Testing and examining code written by others and analysing results.
- Ensuring that systems are safe and secure against cybersecurity threats.
- Working with software developers and software engineers to ensure that development follows established processes and works as intended.
- Assisting product managers with DevOps planning, execution, and query resolution.
- Optimise infrastructure; experience working with Docker or Kubernetes.
- Database (MySQL, Postgres, MongoDB, etc.) installation and management.
- Knowledge of network technologies such as TCP/IP, DNS, and load balancing.
- Must know at least one programming language.

Skills required:
- Deploy updates and fixes.
- Proficiency with Git.
- Optimise infrastructure costs.
- Provide technical support.
- Perform root cause analysis for production errors.
- Investigate and resolve technical issues.
- Develop scripts to automate visualisation.
- Design procedures for system troubleshooting and maintenance.
- Document the architecture, software used, and process followed for projects.
- Proficiency with at least one Infrastructure as Code (IaC) tool such as Ansible, Terraform, Chef, or Puppet.
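As a small illustration of the monitoring and automation scripting this role calls for (not from the posting), a Python health-check sketch that probes placeholder HTTP endpoints and exits nonzero on failure, so a scheduler or CI job can raise an alert.

```python
# Minimal sketch: probe service health endpoints and report failures.
import sys
import urllib.error
import urllib.request

ENDPOINTS = [
    "https://api.example.com/healthz",  # placeholder services
    "https://web.example.com/healthz",
]

def check(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers with a 2xx status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, TimeoutError):
        return False

if __name__ == "__main__":
    failures = [u for u in ENDPOINTS if not check(u)]
    for url in failures:
        print(f"DOWN: {url}")
    sys.exit(1 if failures else 0)  # nonzero exit lets cron/CI alert
```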

Posted 18 hours ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Dear Job Seekers,

Greetings from Voice Bay! We are currently hiring for a Machine Learning Engineer. If you are interested, please submit your application. Please find the JD below for your consideration:

Work Location – Hyderabad
Exp – 4 – 10 Years
Work Mode – 5 Days Work From Office (Mandatory)

Key Responsibilities
- Design, develop, and implement end-to-end machine learning models, from initial data exploration and feature engineering to model deployment and monitoring in production environments.
- Build and optimize data pipelines for both structured and unstructured datasets, focusing on advanced data blending, transformation, and cleansing techniques to ensure data quality and readiness for modeling.
- Create, manage, and query complex databases, leveraging various data storage solutions to efficiently extract, transform, and load data for machine learning workflows.
- Collaborate closely with data scientists, software engineers, and product managers to translate business requirements into effective, scalable, and maintainable ML solutions.
- Implement and maintain robust MLOps practices, including version control, model monitoring, logging, and performance evaluation, to ensure model reliability and drive continuous improvement.
- Research and experiment with new machine learning techniques, tools, and technologies to enhance our predictive capabilities and operational efficiency.

Required Skills & Experience
- 5+ years of hands-on experience in building, training, and deploying machine learning models in a professional, production-oriented setting.
- Demonstrable experience with database creation and advanced querying (e.g., SQL, NoSQL), with a strong understanding of data warehousing concepts.
- Proven expertise in data blending, transformation, and feature engineering, adept at integrating and harmonizing both structured (e.g., relational databases, CSVs) and unstructured (e.g., text, logs, images) data.
- Strong practical experience with cloud platforms for machine learning development and deployment; significant experience with Google Cloud Platform (GCP) services (e.g., Vertex AI, BigQuery, Dataflow) is highly desirable.
- Proficiency in programming languages commonly used in data science (Python preferred; R also considered).
- Solid understanding of various machine learning algorithms (e.g., regression, classification, clustering, dimensionality reduction) and experience with advanced techniques like Deep Learning, Natural Language Processing (NLP), or Computer Vision.
- Experience with machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
- Familiarity with MLOps tools and practices, including model versioning, monitoring, A/B testing, and continuous integration/continuous deployment (CI/CD) pipelines.
- Experience with containerization technologies like Docker and orchestration tools like Kubernetes for deploying ML models as REST APIs.
- Proficiency with version control systems (e.g., Git, GitHub/GitLab) for collaborative development.

Educational Background
- Bachelor's or Master's degree in Statistics, Mathematics, Computer Science, Engineering, Data Science, or a closely related quantitative field.
- Alternatively, a significant certification in Data Science, Machine Learning, or Cloud AI combined with relevant practical experience will be considered.
- A compelling combination of relevant education and professional experience will also be valued.

Interested candidates can share their resume to the email IDs below:
tarunrai@voicebaysolutions.in
hr@voicebaysolutions.in
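For context, a minimal sketch of the end-to-end model workflow the role describes: train, evaluate, and persist a scikit-learn classifier. Real pipelines would add feature engineering, experiment tracking, and deployment (for example to Vertex AI); the dataset here is a stand-in.

```python
# Minimal sketch: train, evaluate, and persist a classifier with scikit-learn.
import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
joblib.dump(model, "model.joblib")  # artifact a serving container would load
```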

Posted 18 hours ago

Apply

0 years

0 - 0 Lacs

India

On-site

Roles and Responsibilities:
- Building and maintaining profitable relationships with key customers.
- Overseeing the relationship with customers handled by your team.
- Resolving customer queries quickly and efficiently.
- Understanding key customers' individual needs and addressing them.
- Being a team player with leadership skills.
- Maintaining a positive attitude focused on customer satisfaction.

Skills Required:
- Good verbal communication skills.
- Self-motivated.
- Sound knowledge of English and Hindi.

We are also accepting applications for this profile from candidates with backgrounds as customer service representative, customer service, customer support, customer care executive, and BPO.

Job Types: Full-time, Permanent, Fresher
Pay: ₹11,000.00 - ₹18,500.00 per month
Schedule: Day shift / Evening shift / Night shift / Rotational shift
Work Location: In person
Speak with the employer: +91 9303315362

Posted 18 hours ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: Data Analyst - MongoDB

We are looking for 5-10 years of experience in database development, with strong knowledge of database management systems such as MongoDB and NoSQL.

Technical Skills:
· 5-10 years of database development experience.
· Strong proficiency in MongoDB and NoSQL database management.
· Experience with database design, indexing, and query optimization.
· Knowledge of aggregation frameworks and data modeling.
· Familiarity with MongoDB Atlas and cloud-based database solutions.
· Understanding of database security, authentication, and encryption.
· Strong problem-solving and analytical skills.
· Familiarity with performance monitoring tools and database tuning techniques.
· Excellent communication and team collaboration abilities.
· Proficiency in Oracle / PL SQL development.

Roles & Responsibilities:
· Design and implement MongoDB database structures for optimal performance.
· Develop and optimize queries, indexes, and data aggregation pipelines.
· Ensure database security, backups, and failover strategies.
· Implement scalability solutions such as sharding and replication.
· Work with backend developers to integrate MongoDB with applications.
· Optimize read and write operations for large-scale databases.
· Monitor database performance and troubleshoot issues.
· Ensure data integrity and consistency across multiple environments.
· Automate database deployment and maintenance tasks.
· Stay updated with MongoDB best practices and new features.

Qualification:
· At least 5-10 years of MongoDB database development experience.
· Education qualification: Any degree from a reputed college.
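A minimal PyMongo sketch of the aggregation-pipeline and indexing work listed above; the connection string, collection, and field names are illustrative assumptions.

```python
# Minimal sketch: compound index plus an aggregation pipeline in PyMongo.
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
orders = client["shop"]["orders"]

# Compound index to support the match/sort pattern used below.
orders.create_index([("status", ASCENDING), ("created_at", ASCENDING)])

# Revenue per customer for completed orders, highest first.
pipeline = [
    {"$match": {"status": "completed"}},
    {"$group": {"_id": "$customer_id",
                "revenue": {"$sum": "$amount"},
                "orders": {"$sum": 1}}},
    {"$sort": {"revenue": -1}},
    {"$limit": 10},
]
for doc in orders.aggregate(pipeline):
    print(doc)
```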

Posted 18 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Summary
Your role in our mission

Essential Job Functions
- Contributes to test planning, scheduling, and managing test resources; leads formal test execution phases on larger projects.
- Defines test cases and creates integration and system test scripts and configuration test questionnaires from functional requirement documents.
- Executes functional tests and authors significant revisions to test materials as necessary through the dry run and official test phases.
- Maintains defect reports and updates reports following regression testing.
- Adheres to and advocates use of established quality methodology and escalates issues as appropriate.
- Understands the functional design of software products/suites being tested and their underlying technologies to facilitate authoring testware, diagnosing system issues, and ensuring that tests accurately address required business functionality. Clarifies ambiguous areas with technical teams.
- Applies basic industry and functional area knowledge related to the software product being tested and applicable regulatory statutes to determine whether system components meet business specifications.
- Develops specified testing deliverables over the lifecycle of the project.

What we're looking for
- Bachelor's degree or equivalent combination of education and experience.
- Bachelor's degree in business, mathematics, engineering, management information systems, or computer science, or a related field preferred.
- Three or more years of software testing experience.
- Experience developing testware from functional design documents and executing testware against a schedule and in compliance with a methodology.
- Experience working with configuration management, defect tracking, query tools, software productivity tools, and templates used to create test scripts, trace matrices, etc.
- Experience working with software product testing and applicable regulatory statutes.

Other Qualifications
- Good organization, people management, and time management skills.
- Good analytical and problem-solving skills.
- Good personal computer and business solutions software skills.
- Good communication skills to interact and present findings to team members.
- Good planning skills.
- Good consulting skills; can effectively interact with clients during project team teleconferences and on-site meetings.
- Ability to write lengthy procedural, step-based narrative test materials, including the necessary testbed set-up steps.
- Ability to work cooperatively as part of a global professional team that may be distributed across geographies and time zones.
- Ability to complete assigned responsibilities independently in a given timeframe with minimal managerial and technical support.
- Willingness to travel.

What you should expect in this role
- Hybrid environment.
- May require evening or weekend work.

Posted 19 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Title: KDB+ Developer
Experience: 3-5 Years
Location: Gurgaon and Chennai
Work Model: Hybrid - 2 days a week in office.

Join our Tick Data Analytics Platform Team, focused on developing solutions for our strategic KDB+ platform. Work closely with various trading platforms and compliance functions. Analyze and manipulate substantial datasets in a fast-paced, low-latency environment.

Key Responsibilities:
- Handle data from various business areas.
- Translate requirements into user analytics.
- Design solutions supporting regulatory initiatives, pricing/trading algorithms, back-testing, and PnL attribution.
- Prototype solutions rapidly in an agile manner for Front Office teams.
- Participate in all aspects of product delivery, including design documents, functional specifications, and architecture.
- Contribute toward enhancing the evolving KDB architecture, including the newly built golden data store (Order Data Store).

Essential Skills:
- Excellent understanding of KDB/Q.
- Experience with a tick data capture and analytics platform.
- In-depth knowledge of KDB engineering concepts (data partitioning, organization for query performance, real-time streaming, IPC).
- Good understanding of data structures and algorithms.
- Ability to work independently with limited guidance.
- Ability to liaise directly with business stakeholders.
- Understanding of development tools like JIRA, Git, Jenkins, etc.
- Ability to design and build components independently.
- Ability to prototype solutions quickly and multitask deliveries in an agile environment.

Posted 19 hours ago

Apply

3.0 years

0 Lacs

Bengaluru South, Karnataka, India

On-site


At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

The Financial Data Sourcing Analyst is a critical role in the Financial Reporting Quality Assurance (FRQA) organization within Corporate Controllership, in support of the Regulatory Reporting Automation program. This role is responsible for driving the definition, gathering, exploration, and analysis of finance data and its data sources to deliver end-to-end automation for our regulatory reporting platforms. The Data Sourcing Architect team oversees the data mapping, profiling, and source-to-target (S2T) analysis mapping for new data requirements of our regulatory reporting automation and systems. It also leads the coordination and orchestration of the close partnership between Product Owners, Report/Business Owners, Technology, and Data Testing teams to ensure data and product features of the data solution are put into production with the highest degree of confidence in the data flow and data system requirements, or to manage the deactivation/decommissioning of data sets of financial data systems. The Data Sourcing Analyst must be a highly analytical, well-organized, data-driven individual with strong time management and a high degree of technical skill, confident in presenting highly complex data concepts and technicalities in simple terms and pragmatically to the team and associated stakeholders.

How will you make an impact in this role?

Key Responsibilities:
- Collaborate with business stakeholders to understand the data needs for regulatory reporting; translate the business requirements into technical specifications for data solutions.
- Develop and implement data management strategies for the Regulatory Reporting Data Domain (RRDD); design and maintain RRDD data models to support regulatory reporting and ensure they are scalable and flexible.
- Partner with business, upstream, and Technology teams to implement data solutions for regulatory reporting; monitor and optimize data systems for performance and efficiency.
- Collaborate with the data governance team to define standards and ensure data quality and consistency in RRDD.
- Perform data sourcing gap analysis and profiling of attributes, and complete source-to-target mapping documents for regulatory reports automation.
- Conduct data analysis on existing processes and datasets to understand and support Point of Arrival (POA) process designs, including migration of RRDD tables from on-premises to BigQuery.
- Determine portfolios, data elements, and grain of data required for designing processes to review data scenarios, providing clarification on how to report on these scenarios in alignment with regulatory guidance.
- Support development of executable data querying algorithms using tools such as SQL that enable validation of data conformity and expected data system functionality, including replication of deterministic logic and filtering criteria for master and reference data to be used across operational and regulatory reporting processes.
- Identify business requirements and develop functional requirement documentation for new data sources and attributes, including design, prototyping, testing, and implementation of report owner and regulatory requirements.
- Document and understand core components of the solution architecture, including data patterns, data-related capabilities, and standardization and conformance of disparate datasets.

Minimum Qualifications
- 3+ years of work experience in data sourcing and analysis.
- 3+ years of work experience in banking / regulatory / financial / technology services.
- Product management, data migration, and data analytics working experience is a plus.
- Experienced in Agile delivery concepts or other project management methodologies.
- Experience in data analytics, data profiling, source-to-target (S2T) data mapping, and analyzing System of Record (SOR) data and its Data Quality (DQ) rules to identify data issues in SORs.
- Strong SQL/NoSQL and data analysis experience; able to write and understand complex SQL and HiveQL, with hands-on Oracle SQL and Hive experience.
- Experience with MS Excel, Power Query, and other analytical tools, e.g., Tableau.
- Experience with Python, Google BigQuery, and PL/SQL.
- Critical thinking and complex problem-solving skills (data application).
- Excellent written and verbal communication, with the ability to communicate highly complex concepts and processes in simple terms and pragmatically.
- A self-starter and proactive team player with excellent relationship-building and collaboration skills, facilitating a network of strong relationships across the organization.

Preferred Qualifications
- Knowledge of US regulatory reports (Y9C, Y14, Y15, 2052a, amongst others) and a general understanding of banking products.
- Working exposure to data analysis in financial data domains to support regulatory and analytical requirements for large-scale banking/financial organizations.
- Experience with Google Cloud capabilities.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
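To illustrate the source-to-target (S2T) validation theme above, here is a minimal pandas sketch (an editorial assumption, not Amex tooling) that reconciles row counts, distinct key counts, and measure totals between a source extract and a target extract. All table and column names are placeholders.

```python
# Minimal sketch: reconcile a source extract against a target extract.
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame,
              key: str, measures: list[str]) -> pd.DataFrame:
    """Compare counts and measure totals between source and target datasets."""
    checks = {
        "row_count": (len(source), len(target)),
        "distinct_keys": (source[key].nunique(), target[key].nunique()),
    }
    for col in measures:
        checks[f"sum_{col}"] = (source[col].sum(), target[col].sum())
    report = pd.DataFrame(checks, index=["source", "target"]).T
    report["match"] = report["source"] == report["target"]
    return report

if __name__ == "__main__":
    src = pd.DataFrame({"acct_id": [1, 2, 3], "balance": [100.0, 250.0, 75.0]})
    tgt = pd.DataFrame({"acct_id": [1, 2, 3], "balance": [100.0, 250.0, 75.0]})
    print(reconcile(src, tgt, key="acct_id", measures=["balance"]))
```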

Posted 19 hours ago

Apply

8.0 years

0 Lacs

Greater Hyderabad Area

On-site


Data Test Engineer (Snowflake, Python, DB Testing, SQL)
Job Location: Hyderabad, Pune, or Gurugram
Experience Level: Mid-Senior (8+ Years)

About Kellton: We are a global IT services and digital product design and development company with subsidiaries that serve startup, mid-market, and enterprise clients across diverse industries, including Finance, Healthcare, Manufacturing, Retail, Government, and Nonprofits. At Kellton, we believe that our people are our greatest asset. We are committed to fostering a culture of collaboration, innovation, and continuous learning. Our core values include integrity, customer focus, teamwork, and excellence. To learn more about our organization, please visit us at www.kellton.com

Are you craving a dynamic and autonomous work environment? If so, this opportunity may be just what you're looking for. At our company, we value your critical thinking skills and encourage your input and creative ideas to supply the best talent available. To boost your productivity, we provide a comprehensive suite of IT tools and practices backed by an experienced team.

About the Role: We are seeking a highly skilled DBT Engineer to take charge of migrating ETL pipelines from SSIS (SQL Server Integration Services) to DBT (Data Build Tool) and developing new DBT pipelines. The candidate will have strong experience in building data pipelines, working with SQL Server, and leveraging DBT for data transformation. This role requires deep knowledge of modern data platforms and the ability to ensure a smooth transition from legacy systems to modern architectures.

What you will do:
- Develop and execute unit tests for migrated SQL Server stored procedures rewritten in Snowflake SQL to ensure functionality and performance are consistent with the original logic (a parity-test sketch follows this list).
- Validate data accuracy, consistency, and performance in the new Snowflake environment.
- Create and execute unit tests and integration tests for DBT pipelines, ensuring that transformations are accurate, reliable, and performant.
- Implement DBT test suites to verify the correctness of data models, transformation logic, and data quality.
- Validate the output of scripts against expected results and ensure efficient processing in the AWS/Snowflake cloud infrastructure.
- Perform end-to-end testing for the entire data flow, from DBT pipelines and Snowflake transformations to report generation via Python/R scripts.

Required Skills and Qualifications:
- Proven experience in testing data pipelines and ETL processes using DBT or similar tools.
- Experience testing SQL transformations, particularly migrations from SQL Server to Snowflake SQL.
- Hands-on experience with testing Python and R scripts, particularly in data processing or analytical environments.
- Experience working on data migration projects, particularly to cloud environments like Snowflake.
- Strong knowledge of SQL for writing test cases and validating query results.
- Experience with DBT testing frameworks, including setting up and running DBT tests.
- Proficiency in Python and R for testing and debugging data transformation scripts.
- Familiarity with version control systems (e.g., Git) for managing test scripts and test case versioning.
- Knowledge of CI/CD pipelines for integrating automated testing into the development process.
- Familiarity with cloud platforms (AWS) and cloud-native databases like Snowflake.
- Experience in performance testing and load testing for SQL queries, pipelines, and scripts running on large datasets.
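A minimal sketch of the migration-parity testing described above, with both sides stubbed as pandas frames; in practice each function would run the legacy SQL Server query and the rewritten Snowflake SQL against live connections. All names are placeholders.

```python
# Minimal sketch: assert a rewritten Snowflake transformation matches the
# legacy SQL Server result. Both sides are stubbed so the test is runnable.
import pandas as pd
import pandas.testing as pdt

def legacy_result() -> pd.DataFrame:
    # Stub for: SELECT region, SUM(amount) AS total ... on SQL Server
    return pd.DataFrame({"region": ["E", "W"], "total": [300.0, 120.0]})

def migrated_result() -> pd.DataFrame:
    # Stub for: the rewritten Snowflake SQL / DBT model output
    return pd.DataFrame({"region": ["E", "W"], "total": [300.0, 120.0]})

def test_migration_parity():
    old = legacy_result().sort_values("region").reset_index(drop=True)
    new = migrated_result().sort_values("region").reset_index(drop=True)
    pdt.assert_frame_equal(old, new, check_like=True)  # column-order agnostic

if __name__ == "__main__":
    test_migration_parity()
    print("parity check passed")
```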
What we offer you:
· Existing clients in multiple domains to work with.
· A strong and efficient team committed to quality output.
· Opportunities to enhance your knowledge and gain industry domain expertise by working in varied roles.
· A team of experienced, fun, and collaborative colleagues.
· A hybrid work arrangement for flexibility and work-life balance (if the client/project allows).
· Competitive base salary and job satisfaction.

Join our team and become part of an exciting company where your expertise and ideas are valued, and where you can make a significant impact in the IT industry. Apply today! Interested applicants, please submit your detailed resume stating your current and expected compensation and notice period to srahaman@kellton.com

Posted 19 hours ago

Apply

7.0 years

0 Lacs

Delhi, India

On-site


Role Expectations:

Data Collection and Cleaning:
- Collect, organize, and clean large datasets from various sources (internal databases, external APIs, spreadsheets, etc.).
- Ensure data accuracy, completeness, and consistency by cleaning and transforming raw data into usable formats.

Data Analysis:
- Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies.
- Conduct statistical analysis to support decision-making and uncover insights.
- Use analytical methods to identify opportunities for process improvements, cost reductions, and efficiency enhancements.

Reporting and Visualization:
- Create and maintain clear, actionable, and accurate reports and dashboards for both technical and non-technical stakeholders.
- Design data visualizations (charts, graphs, and tables) that communicate findings effectively to decision-makers.
- Experience with Power BI, Tableau, and Python libraries for data visualization such as matplotlib, seaborn, plotly, pyplot, and pandas.
- Experience generating descriptive, predictive, and prescriptive insights with Gen AI using MS Copilot in Power BI.
- Experience in prompt engineering and RAG architectures.
- Prepare reports for upper management and other departments, presenting key findings and recommendations.

Collaboration:
- Work closely with cross-functional teams (marketing, finance, operations, etc.) to understand their data needs and provide actionable insights.
- Collaborate with IT and database administrators to ensure data is accessible and well-structured.
- Provide support and guidance to other teams regarding data-related questions or issues.

Data Integrity and Security:
- Ensure compliance with data privacy and security policies and practices.
- Maintain data integrity and assist with implementing best practices for data storage and access.

Continuous Improvement:
- Stay current with emerging data analysis techniques, tools, and industry trends.
- Recommend improvements to data collection, processing, and analysis procedures to enhance operational efficiency.

Qualifications:
Education: Bachelor's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field. A Master's degree or relevant certifications (e.g., in data analysis or business intelligence) is a plus.
Experience:
- Proven experience as a Data Analyst or in a similar analytical role (typically 7+ years).
- Experience with data visualization tools (e.g., Tableau, Power BI, Looker).
- Strong knowledge of SQL and experience with relational databases.
- Familiarity with data manipulation and analysis tools (e.g., Python, R, Excel, SPSS).
- Experience with Power BI, Tableau, and Python visualization libraries such as matplotlib, seaborn, plotly, pyplot, and pandas.
- Experience with big data technologies (e.g., Hadoop, Spark) is a plus.
Technical Skills:
- Proficiency in SQL and data query languages.
- Knowledge of statistical analysis and methodologies.
- Experience with data visualization and reporting tools.
- Knowledge of data cleaning and transformation techniques.
- Familiarity with machine learning and AI concepts is an advantage (for more advanced roles).
Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent attention to detail and the ability to identify trends in complex data sets.
- Good communication skills to present data insights clearly to both technical and non-technical audiences.
- Ability to work independently and as part of a team.
- Strong time management and organizational skills, with the ability to prioritize tasks effectively.
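For context, a minimal EDA and visualization sketch in the pandas/seaborn/matplotlib stack this posting names, run on synthetic data so it is self-contained.

```python
# Minimal sketch: descriptive summary, a 3-sigma anomaly screen, and a chart.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "region": rng.choice(["North", "South", "East", "West"], size=500),
    "revenue": rng.gamma(shape=2.0, scale=1500.0, size=500),
})

# Descriptive summary and a quick anomaly screen (3-sigma rule).
print(df.groupby("region")["revenue"].describe())
outliers = df[np.abs(df["revenue"] - df["revenue"].mean())
              > 3 * df["revenue"].std()]
print(f"{len(outliers)} potential outliers flagged")

sns.boxplot(data=df, x="region", y="revenue")
plt.title("Revenue distribution by region")
plt.tight_layout()
plt.savefig("revenue_by_region.png")  # dashboard-ready artifact
```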

Posted 19 hours ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Description
The Amazon Transportation team is looking for an innovative, hands-on, and customer-obsessed Business Analyst for the Analytics team. The candidate must be detail-oriented, have superior verbal and written communication skills, strong organizational skills, and excellent technical skills, and should be able to juggle multiple tasks at once. The ideal candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience, and get the right things done. This job requires you to constantly hit the ground running and have the ability to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help operations streamline the process, identifying gaps in the existing process by analyzing data and liaising with the relevant team(s) to plug them, and analyzing data and metrics and sharing updates with the internal teams.

Key job responsibilities
- Apply multi-domain/process expertise in day-to-day activities and own the end-to-end roadmap.
- Translate complex or ambiguous business problem statements into analysis requirements and maintain a high bar throughout the execution.
- Define the analytical approach; review and vet it with stakeholders.
- Proactively and independently work with stakeholders to construct use cases and associated standardized outputs.
- Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering for full-scale automation.
- Have a working knowledge of the data available or needed by the wider business for more complex or comparative analysis.
- Work with a variety of data sources and pull data using efficient query development that requires less post-processing (e.g., window functions, virt usage; see the sketch after this listing).
- When needed, pull data from multiple similar sources to triangulate on data fidelity.
- Actively manage the timeline and deliverables of projects, focusing on interactions in the team.
- Provide program communications to stakeholders.
- Communicate roadblocks to stakeholders and propose solutions.
- Represent the team on medium-size analytical projects in your own organization and effectively communicate across teams.

A day in the life
- Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes.
- Handle large data sets in analysis through the use of additional tools.
- Derive recommendations from analysis that significantly impact a department, create new processes, or change existing processes.
- Understand the basics of test and control comparison; may provide insights through basic statistical measures such as hypothesis testing.
- Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved.
- Communicate complex analytical insights and business implications effectively.

About the team
The AOP (Analytics Operations and Programs) team's mission is to standardize BI and analytics capabilities and to reduce repeated analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG, and SA marketplaces. AOP is responsible for providing visibility on operations performance and implementing programs to improve network efficiency and defect reduction. The team has a diverse mix of strong engineers, analysts, and scientists who champion customer obsession. We enable operations to make data-driven decisions by developing near real-time dashboards and self-serve deep-dive capabilities and by building advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with Operations teams.

Basic Qualifications
- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools.
- Experience with one or more industry analytics visualization tools (e.g., Excel, Tableau, QuickSight, MicroStrategy, Power BI) and statistical methods (e.g., t-test, chi-squared).
- Experience with a scripting language (e.g., Python, Java, or R).

Preferred Qualifications
- Master's degree or an advanced technical degree.
- Knowledge of data modeling and data pipeline design.
- Experience with statistical analysis and correlation analysis.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ASSPL - Karnataka
Job ID: A3009286
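As a small illustration of the window-function style of query the posting mentions, here is a runnable sketch against SQLite's in-memory engine; the shipment table and its columns are made up.

```python
# Minimal sketch: a running total per lane via a SQL window function,
# computed in the database rather than in client-side post-processing.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE shipments (lane TEXT, day TEXT, packages INTEGER);
    INSERT INTO shipments VALUES
        ('BLR-HYD', '2025-06-01', 120), ('BLR-HYD', '2025-06-02', 150),
        ('BLR-HYD', '2025-06-03', 90),  ('DEL-BOM', '2025-06-01', 200),
        ('DEL-BOM', '2025-06-02', 180), ('DEL-BOM', '2025-06-03', 240);
""")

rows = con.execute("""
    SELECT lane, day, packages,
           SUM(packages) OVER (PARTITION BY lane ORDER BY day) AS running_total
    FROM shipments
    ORDER BY lane, day
""").fetchall()
for r in rows:
    print(r)
```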

Posted 19 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You'll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world.

Data Visualization (Power BI) Developer – Global Finance Analytics COE

Careers that Change Lives
Join our Global Finance Analytics Center of Excellence (COE) as a Power BI Developer, where you will create dynamic, data-driven dashboards and reports that provide meaningful insights for financial and business decision-making. You will work closely with Finance, Data Science, and Engineering teams to develop interactive visualizations that drive data accessibility. This role requires an average of 2-3 days per week of overlapping work hours with the USA team to ensure seamless collaboration.

A Day in the Life
As a Power BI Developer, you will:
- Design and develop Power BI dashboards and reports with intuitive user experiences.
- Optimize data models, ensuring performance efficiency and best practices in DAX, M Query, and data transformations.
- Integrate data from Snowflake, SQL databases, and enterprise systems for analytics and reporting.
- Collaborate with stakeholders to understand business needs and translate them into actionable visual solutions.
- Ensure data governance, security, and role-based access controls in reporting solutions.
- Automate reporting processes and drive self-service BI adoption within Finance and Business teams.
- Stay up to date with emerging trends in BI, data visualization, and cloud analytics.

Must Have: Minimum Requirements
- Bachelor's degree in Computer Science, Information Systems, Business Analytics, or a related field.
- 5+ years of experience developing Power BI dashboards and reports.
- Strong proficiency in DAX, Power Query (M), and SQL.
- Experience integrating Power BI with cloud platforms (Azure, Snowflake, or AWS).
- Strong data modeling skills and performance tuning expertise.
- Ability to interpret business requirements and translate them into compelling data visualizations.

Nice to Have
- Experience with Python and AI-powered analytics in Power BI.
- Knowledge of financial reporting and forecasting dashboards.
- Understanding of SAP, OneStream, or other ERP systems for financial reporting.

Physical Job Requirements
The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position.

Benefits & Compensation
Medtronic offers a competitive salary and flexible benefits package. A commitment to our employees' lives is at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage.

About Medtronic
We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 95,000+ passionate people. We are engineers at heart — putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves, and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary. Learn more about our business, mission, and our commitment to diversity here.

Posted 19 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Company Description: About Holiday Tribe
Holiday Tribe is a Great Place To Work® Certified™, seed-stage VC-funded travel-tech brand based in Gurugram. We specialize in crafting unforgettable leisure travel experiences by integrating advanced technology, leveraging human expertise, and prioritizing customer success. With holidays curated across 30+ destinations worldwide, partnerships with renowned tourism boards, and recognition as the Emerging Holiday Tech Company at the India Travel Awards 2023, Holiday Tribe is transforming the travel industry. Our mission is to redefine how Indians experience holidays, making travel planning faster, smarter, and more personalized, ensuring every trip is truly seamless and unforgettable.

Key Responsibilities

AI System Development
- Design and implement Retrieval Augmented Generation (RAG) systems for travel recommendation and itinerary planning.
- Build and optimize large language model integrations using frameworks like LangChain for travel-specific use cases.
- Develop semantic search capabilities using vector databases and embedding models for travel content discovery.
- Create tool-calling architectures that enable AI agents to interact with booking systems, inventory APIs, and external travel services.
- Implement intelligent conversation flows for customer interactions and sales assistance.

Travel Intelligence Platform
- Build personalized recommendation engines that understand traveler preferences, seasonal factors, and destination characteristics.
- Develop natural language processing capabilities for interpreting customer travel requests and preferences.
- Implement real-time itinerary generation systems that consider multiple constraints (budget, time, preferences, availability).
- Create AI-powered tools to assist travel experts in creating customized packages faster.
- Build semantic search engines for finding relevant travel content based on user intent and contextual understanding.

AI Agent & Tool Integration
- Design and implement function-calling systems that allow LLMs to execute actions like booking confirmations, inventory checks, and pricing queries.
- Build multi-agent systems where specialized AI agents handle different aspects of travel planning (accommodation, transportation, activities).
- Create tool orchestration frameworks that enable AI systems to chain multiple API calls for complex travel operations.
- Implement safety and validation layers for AI-initiated actions in critical systems.

Data & Model Operations
- Work with travel knowledge graphs to enhance AI understanding of destinations, accommodations, and activities.
- Implement hybrid search systems combining semantic similarity with traditional keyword-based search.
- Build vector indexing strategies for efficient similarity search across large travel content databases.
- Implement model evaluation frameworks to ensure high-quality AI outputs.
- Optimize AI model performance for cost-efficiency and response times.
- Collaborate with data engineers to build robust data pipelines for AI training and inference.

Cross-functional Collaboration
- Partner with product teams to translate travel domain requirements into AI capabilities.
- Work closely with backend engineers to integrate AI services into the broader platform architecture.
- Collaborate with UX teams to design intuitive AI-human interaction patterns.
- Support sales and customer success teams by improving AI assistant capabilities.

Required Qualifications

Technical Skills
- 3+ years of experience in AI/ML engineering with a focus on natural language processing and large language models.
- Strong expertise in RAG (Retrieval Augmented Generation) systems, including vector databases, embedding models, and retrieval strategies.
- Hands-on experience with LangChain or similar LLM orchestration frameworks, including tool calling and agent patterns.
- Proficiency with semantic search technologies, including vector databases, embedding models, and similarity search algorithms.
- Experience with tool calling and function calling in LLM applications, including API integration and action validation.
- Proficiency with major LLM APIs (OpenAI, Anthropic, Google, etc.) and understanding of prompt engineering best practices.
- Experience with vector databases such as Milvus, Weaviate, Chroma, or similar solutions.
- Strong Python programming skills with experience in AI/ML libraries (transformers, sentence-transformers, scikit-learn).

AI/ML Foundation
- Solid understanding of transformer architectures, attention mechanisms, and modern NLP techniques.
- Deep knowledge of embedding models and semantic similarity techniques (sentence transformers, dense retrieval methods).
- Experience with hybrid search architectures combining dense and sparse retrieval methods.
- Knowledge of fine-tuning approaches and model adaptation strategies.
- Understanding of agent-based AI systems and multi-step reasoning capabilities.
- Understanding of AI evaluation metrics and testing methodologies.
- Familiarity with MLOps practices and model deployment strategies.

Software Engineering
- Experience building production-grade AI applications with proper error handling and monitoring.
- Experience with API integration and orchestration for complex multi-step workflows.
- Understanding of API design and microservices architecture.
- Familiarity with cloud platforms (AWS, GCP, Azure) and their AI/ML services.
- Experience with version control, CI/CD, and collaborative development practices.

Preferred Qualifications

Advanced AI Experience
- Experience with multi-modal AI systems (text, images, structured data).
- Advanced knowledge of agent frameworks (LangGraph, CrewAI, AutoGen) and agentic workflows.
- Experience with advanced semantic search techniques, including re-ranking, query expansion, and result fusion.
- Experience with model fine-tuning, especially for domain-specific applications.
- Knowledge of tool-use optimization and function-calling best practices.
- Understanding of AI safety, bias mitigation, and responsible AI practices.

Technical Depth
- Experience with advanced RAG techniques (hybrid search, re-ranking, query expansion, contextual retrieval).
- Knowledge of vector search optimization, including indexing strategies, similarity metrics, and performance tuning.
- Experience building tool-calling systems that integrate with external APIs and services.
- Knowledge of graph databases and knowledge graph construction.
- Familiarity with conversational AI and dialogue management systems.
- Experience with A/B testing frameworks for AI systems.

Technical Challenges You'll Work On
- Building semantic search that understands travel intent ("romantic getaway" vs "adventure trip" vs "family vacation").
- Creating AI agents that can book multi-leg journeys by coordinating with multiple service providers.
- Implementing tool-calling systems that safely execute real booking actions with proper validation.
- Designing RAG systems that provide accurate, up-to-date travel information from diverse sources.
- Building conversational AI that can handle complex travel planning requirements.
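A minimal sketch of the semantic-search retrieval step at the heart of the RAG work described above: embed a tiny corpus with sentence-transformers and rank snippets against a query by cosine similarity. The model choice and corpus are illustrative, not taken from the employer.

```python
# Minimal sketch: semantic retrieval by cosine similarity over embeddings.
import numpy as np
from sentence_transformers import SentenceTransformer

corpus = [
    "Candle-lit beach dinners and couples' spa in the Maldives.",
    "White-water rafting and cliff camping in Rishikesh.",
    "Kid-friendly resorts with water parks in Dubai.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = model.encode(corpus, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[tuple[float, str]]:
    """Top-k snippets by cosine similarity (dot product of unit vectors)."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q
    top = np.argsort(scores)[::-1][:k]
    return [(float(scores[i]), corpus[i]) for i in top]

# "Romantic getaway" should rank the Maldives snippet first.
for score, text in retrieve("romantic getaway for our anniversary"):
    print(f"{score:.3f}  {text}")
```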

Posted 19 hours ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Dynamics 365 .NET (.NET 8 + Azure AD B2C + SQL + Power Platform)
Location: Hyderabad
Experience Required: 8–10 years
Engagement Type: Full-time / Contract

Core Responsibilities:

Backend (.NET 8 APIs):
● Develop RESTful services using .NET 8 with role-based access.
● Secure endpoints with OAuth2 and implement RBAC across APIs.
● Integrate with Dynamics 365, Azure SQL, and Power Automate.

Identity Management (Azure AD B2C):
● Configure sign-up/sign-in policies and federated identity with Google, Apple, LinkedIn, etc.
● Handle access tokens, session handling, and claims transformation.
● Coordinate identity flows across mobile and web apps.

Data Layer (Azure SQL):
● Design a normalized schema for transactional data.
● Write performant SQL queries, stored procedures, and indexing strategies.
● Ensure data encryption at rest and in transit.
● Support audit logging and access logging via SQL telemetry.

Integration (Power Automate + Dynamics 365):
● Build workflows for financial tracking, CRM entity updates, and external integrations.
● Use custom connectors or HTTP actions to integrate with .NET APIs.
● Customize Dynamics 365 entities and business rules as required by workflows.
● Manage data consistency and flow across Power Platform and the backend.

Technical Skills Required:
● Backend: .NET Core / .NET 8, REST APIs, RBAC
● Identity: Azure AD B2C, OAuth2, OpenID Connect, external IdP federation
● Database: Azure SQL, T-SQL, schema design, query optimization
● CRM/Workflow: Power Automate, Dynamics 365 customization, custom flows
● DevOps: Git, Azure DevOps (pipelines, releases), CI/CD familiarity
● Security: SSL/TLS, token-based access, encrypted data handling

Minimum Requirements:
● Experience building scalable .NET Core / .NET 8 applications.
● Hands-on with Azure AD B2C setup and federation scenarios.
● Proficiency in Azure SQL database development and optimization.
● Experience with Power Platform (especially Power Automate) and Dynamics 365.
● Ability to work independently across components with clean integration boundaries.

Contact: Shanmukh, Shanmukh.siva@navasoftware.com

The token-based access flow described above is illustrated by the sketch that follows.
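For the identity layer, a hedged sketch of a generic OAuth2 client-credentials exchange against a B2C-style token endpoint is shown below. Every name (tenant, policy, client ID, scope, API endpoint) is a placeholder assumption, and the exact endpoint shape and supported flows depend on tenant configuration; this is a sketch, not a definitive Azure AD B2C implementation.

```python
# A minimal sketch of an OAuth2 client-credentials token request used to
# call an RBAC-protected API. All identifiers below are hypothetical.
import requests

TENANT = "contosob2c"                      # hypothetical B2C tenant
POLICY = "B2C_1_signupsignin"              # hypothetical user flow / policy
TOKEN_URL = (
    f"https://{TENANT}.b2clogin.com/{TENANT}.onmicrosoft.com/"
    f"{POLICY}/oauth2/v2.0/token"
)

resp = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "00000000-0000-0000-0000-000000000000",   # placeholder
        "client_secret": "<secret-from-key-vault>",            # placeholder
        "scope": f"https://{TENANT}.onmicrosoft.com/api/.default",
    },
    timeout=10,
)
resp.raise_for_status()
access_token = resp.json()["access_token"]

# The bearer token is then sent to the protected .NET API.
api = requests.get(
    "https://api.example.com/invoices",    # hypothetical protected endpoint
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
print(api.status_code)
```

The server side enforces RBAC by validating the token's signature and claims before mapping them to roles; the client's only job is to obtain and present the bearer token.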

Posted 19 hours ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Description

Amazon’s global fulfillment network enables any merchant to ship items that are ordered on Amazon to any place on earth. There is a complex network of ways in which items move between vendor locations, Amazon warehouses, and customer locations, as well as several intermediate locations through which packages travel before reaching the customer. With a scale of millions of packages, each with different attributes and delivery requirements, what results is a highly dense graph of nodes.

We have built a highly respected software engineering team focused on solving complex problems in worldwide transportation using workflows, optimization algorithms, and machine learning systems. These are large-scale distributed systems handling millions of packages being shipped through the Amazon logistics network. You will be working with senior SDEs and principal engineers to solve problems of scale, improve existing services and build new ones, and work on deep and complex algorithms to improve the experience of our customers globally while optimizing network operations.

Key job responsibilities
● Collaborate with experienced cross-disciplinary Amazonians to conceive, design, and bring innovative products and services to market.
● Design and build innovative technologies in a large distributed computing environment and help lead fundamental changes in the industry.
● Create solutions to run predictions on distributed systems with exposure to innovative technologies at incredible scale and speed.
● Build distributed storage, index, and query systems that are scalable, fault-tolerant, low cost, and easy to manage/use.
● Design and code the right solutions starting with broadly defined problems.
● Work in an agile environment to deliver high-quality software.

Basic Qualifications
● 1+ years of non-internship professional software development experience
● Experience programming with at least one software programming language

Preferred Qualifications
● Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ - H84
Job ID: A3009283

The routing flavor of this work is illustrated by the shortest-path sketch below.
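Routing packages across a dense network of nodes is, at its core, a shortest-path problem. The sketch below is a textbook Dijkstra implementation over a toy graph; the node names and edge costs are invented for illustration and have no relation to Amazon's actual systems or algorithms.

```python
# A minimal Dijkstra shortest-path sketch over a toy logistics graph.
# Node names and edge costs are invented for illustration only.
import heapq

graph = {
    "vendor":      [("warehouse_a", 4), ("warehouse_b", 7)],
    "warehouse_a": [("sort_center", 3)],
    "warehouse_b": [("sort_center", 1)],
    "sort_center": [("customer", 5)],
    "customer":    [],
}

def shortest_path(source: str, target: str):
    """Return (cost, path) of the cheapest route, or (inf, []) if unreachable."""
    dist = {source: 0}
    prev = {}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry, a cheaper route was already found
        for neighbor, cost in graph[node]:
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                prev[neighbor] = node
                heapq.heappush(heap, (nd, neighbor))
    return float("inf"), []

# Cheapest route is via warehouse_a: 4 + 3 + 5 = 12.
print(shortest_path("vendor", "customer"))
```

Real transportation networks add time windows, capacities, and stochastic delays, which is where the optimization and machine-learning systems mentioned above come in.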

Posted 19 hours ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Overview:
The person will be responsible for expanding and optimizing our data and data pipeline architecture. The ideal candidate is an experienced data pipeline builder who enjoys optimizing data systems and building them from the ground up.

You'll be responsible for:
● Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
● Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud technologies.
● Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
● Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
● Work with data and analytics experts to strive for greater functionality in our data systems.

You'd have:
We are looking for a candidate with 3+ years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field, with experience using the following software/tools:
● Experience with data pipeline and workflow management tools: Apache Airflow, NiFi, Talend, etc. (a minimal Airflow sketch follows this posting).
● Experience with relational SQL and NoSQL databases, including ClickHouse, Postgres, and MySQL.
● Experience with stream-processing systems: Storm, Spark Streaming, Kafka, etc.
● Experience with object-oriented/object-function scripting languages: Python, Scala, etc.
● Experience building and optimizing data pipelines, architectures, and data sets.
● Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
● Strong analytic skills related to working with unstructured datasets.
● Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
● Working knowledge of message queuing, stream processing, and highly scalable data stores.

Why Join Us?
● Impactful Work: Play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry.
● Tremendous Growth Opportunities: Be part of a rapidly growing company in the telecom and CPaaS space, with opportunities for professional development.
● Innovative Environment: Work alongside a world-class team in a challenging and fun environment, where innovation is celebrated.

Tanla is an equal opportunity employer. We champion diversity and are committed to creating an inclusive environment for all employees. www.tanla.com
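As referenced in the tools list above, here is a minimal Apache Airflow DAG sketch for a daily extract-transform-load run. The DAG ID, task names, and task bodies are illustrative placeholders, not a real pipeline.

```python
# A minimal Apache Airflow DAG sketch for a daily ETL run.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw rows from a source system")

def transform():
    print("clean and reshape the extracted rows")

def load():
    print("write the transformed rows to the warehouse")

with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,       # do not backfill runs before deployment
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```

The `>>` operator declares task dependencies, so the scheduler runs the three steps in order and retries or alerts on failure according to the DAG's configuration.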

Posted 19 hours ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Description

Amazon’s global fulfillment network enables any merchant to ship items that are ordered on Amazon to any place on earth. There is a complex network of ways in which items move between vendor locations, Amazon warehouses, and customer locations, as well as several intermediate locations through which packages travel before reaching the customer. With a scale of millions of packages, each with different attributes and delivery requirements, what results is a highly dense graph of nodes.

We have built a highly respected software engineering team focused on solving complex problems in worldwide transportation using workflows, optimization algorithms, and machine learning systems. These are large-scale distributed systems handling millions of packages being shipped through the Amazon logistics network. You will be working with senior SDEs and principal engineers to solve problems of scale, improve existing services and build new ones, and work on deep and complex algorithms to improve the experience of our customers globally while optimizing network operations.

Key job responsibilities
● Collaborate with experienced cross-disciplinary Amazonians to conceive, design, and bring innovative products and services to market.
● Design and build innovative technologies in a large distributed computing environment and help lead fundamental changes in the industry.
● Create solutions to run predictions on distributed systems with exposure to innovative technologies at incredible scale and speed.
● Build distributed storage, index, and query systems that are scalable, fault-tolerant, low cost, and easy to manage/use.
● Design and code the right solutions starting with broadly defined problems.
● Work in an agile environment to deliver high-quality software.

Basic Qualifications
● 1+ years of non-internship professional software development experience
● Experience programming with at least one software programming language

Preferred Qualifications
● Bachelor's degree in computer science or equivalent

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ - H84
Job ID: A3009288

Posted 19 hours ago

Apply

4.0 years

0 Lacs

India

Remote


Wobot AI is hiring a Senior Backend Developer (Node.js + ClickHouse) to help build the data backbone of our automation and vision intelligence platform. Explore the details below and see if you’re the right fit!

What you'll do:
● Design and implement ingestion pipelines into ClickHouse for Computer Vision and other high-volume structured insights.
● Model efficient, scalable schemas using MergeTree, ReplacingMergeTree, and appropriate partitioning strategies.
● Implement deduplication, version control, and update-safe ingestion strategies tailored for real-time and mutable data.
● Build and maintain backend services and APIs that expose ClickHouse data to other systems such as product dashboards and internal workflows.
● Collaborate with CV and backend teams to ensure seamless data flow, system integration, and ingestion resilience.
● Work with product and data consumers to support high-performance analytical queries and structured data access.
● Monitor and maintain ingestion health, performance, observability, and error handling across the pipeline.
● Contribute to future-facing system design that enables AI agent integration, context-aware workflows, and evolving protocols such as MCP.

What we are looking for:

Must have:
● 4 to 6 years of backend development experience with strong proficiency in Node.js.
● At least 1 year of production-grade experience with ClickHouse, including schema design and performance tuning.
● Experience building data pipelines using RabbitMQ, Pub/Sub, or other messaging systems.
● Solid understanding of time-series data, analytical query patterns, and distributed ingestion design.
● Familiarity with Google Cloud Platform and serverless development practices.

Good to have:
● Experience with TypeScript in production backend systems.
● Exposure to building serverless applications using Cloud Run or AWS Lambda.
● Experience working with materialized views, TTL-based retention, and ingestion optimization in ClickHouse.
● Prior experience with Computer Vision pipelines or real-time data flows.
● Awareness of modern backend patterns that support AI/ML-generated insights, structured data orchestration, and agent-based interactions.
● Familiarity with designing systems that could interface with evolving protocols such as MCP or context-rich feedback systems.

How we work:
● We use Microsoft Teams for daily communication and conduct daily standups and team meetings over Teams.
● We value open discussion, ownership, and a founder mindset.
● We prioritize design, amazing UI/UX, documentation, to-do lists, and data-based decision-making.
● We encourage team bonding through bi-weekly town halls, destressing sessions with a certified healer, and fun company retreats twice a year.
● We offer a 100% remote workplace model, health insurance, attractive equity options for top performers, mental health consultations, company-sponsored upskilling courses, growth hours, the chance to give back with 40 hours for community causes, and access to a financial advisor.

Wobot is an Equal Opportunity Employer. If you have a passion for developing innovative solutions and want to work on cutting-edge technology, we encourage you to apply for this exciting opportunity.

A small schema sketch for the update-safe ingestion described above appears below.
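To illustrate the update-safe ingestion pattern named in the role (ReplacingMergeTree with partitioning), here is a minimal sketch. The host, table, and column names are illustrative assumptions, and the client library shown is clickhouse-connect (Python, for consistency with the other sketches in this page) rather than a Node.js driver.

```python
# A minimal update-safe ClickHouse table for CV detections: re-ingested rows
# with a higher version supersede older rows after background merges.
import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost")  # assumed local server

client.command("""
    CREATE TABLE IF NOT EXISTS cv_detections
    (
        camera_id   String,
        detected_at DateTime,
        label       String,
        confidence  Float32,
        version     UInt64          -- monotonically increasing per upsert
    )
    ENGINE = ReplacingMergeTree(version)
    PARTITION BY toYYYYMM(detected_at)
    ORDER BY (camera_id, detected_at, label)
""")

# FINAL forces deduplicated reads before background merges complete.
rows = client.query(
    "SELECT label, confidence FROM cv_detections FINAL "
    "WHERE camera_id = 'cam-01' ORDER BY detected_at DESC LIMIT 10"
)
print(rows.result_rows)
```

The ORDER BY tuple acts as the deduplication key: two rows with the same (camera_id, detected_at, label) collapse to the one with the highest version, which is what makes re-ingestion of corrected CV insights safe.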

Posted 19 hours ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We are seeking a highly motivated and detail-oriented Customer Service Representative to join our team. In this role, you will be responsible for handling customer inquiries, issues, and feedback via email. The ideal candidate will have excellent written communication skills, a strong understanding of customer service principles, and the ability to provide timely and effective solutions to customer concerns.

Job Responsibilities:
● Strong command of written English (grammatically correct written English)
● Customer support: assist customers and resolve their queries via email or chat
● Ensure customer satisfaction through prompt responses and query resolution
● Coordinate with the internal team on issue resolution and customer communication
● Ensure that tasks are completed daily and in a timely manner
● Respond to customers on ticketed issues, follow-ups, etc.

Mandatory Requirements:
● Should be ready to work night shifts, 5 days a week
● Should be ready to work from the office

Soft Skills:
● Excellent verbal and written communication skills
● Prioritizing and organizing skills
● Logical and reasoning abilities to make the right decisions when resolving customer issues
● Result-oriented, able to drive toward stringent targets
● Basic knowledge of customer services
● Punctual and responsible with work
● Passion to learn and grow

Experience: only 1 to 2 years required; freshers can also apply.

Email ID: shivam.chondhe@binated.com
WhatsApp/Call: +91 87670 01029

Posted 19 hours ago

Apply

14.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Requirements

Description and Requirements

Position Summary:
A highly skilled Big Data (Hadoop) Administrator responsible for the installation, configuration, engineering, and architecture of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, scripting, and infrastructure-as-code for automating and optimizing operations is highly desirable. Experience collaborating with cross-functional teams, including application development, infrastructure, and operations, is highly preferred.

Job Responsibilities:
● Manages the design, distribution, performance, replication, security, availability, and access requirements for large and complex Big Data clusters.
● Designs and develops the architecture and configurations to support various application needs; implements backup, recovery, archiving, conversion strategies, and performance tuning; manages job scheduling, application release, cluster changes, and compliance.
● Identifies and resolves issues utilizing structured tools and techniques.
● Provides technical assistance and mentoring to staff in all aspects of Hadoop cluster management; consults and advises application development teams on security, query optimization, and performance.
● Writes scripts to automate routine cluster management tasks and documents maintenance processing flows per standards.
● Implements industry best practices while performing Hadoop cluster administration tasks.
● Works in an Agile model with a strong understanding of Agile concepts.
● Collaborates with development teams to provide and implement new features.
● Debugs production issues by analyzing logs directly and using tools like Splunk and Elastic.
● Addresses organizational obstacles to enhance processes and workflows.
● Adopts and learns new technologies based on demand and supports team members by coaching and assisting.

Education: Bachelor's degree in Computer Science, Information Systems, or another related field, with 14+ years of IT and infrastructure engineering work experience.

Experience: 14+ years total IT experience and 10+ years relevant experience in Big Data database administration.

Technical Skills:
● Big Data Platform Management: Expertise in managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL.
● Data Infrastructure & Security: Proficient in designing and implementing robust data infrastructure solutions with a strong focus on data security, utilizing tools like Apache Ranger and Kerberos.
● Performance Tuning & Optimization: Skilled in performance tuning and optimization of big data environments, leveraging advanced techniques to enhance system efficiency and reduce latency.
● Backup & Recovery: Experienced in developing and executing comprehensive backup and recovery strategies to safeguard critical data and ensure business continuity.
● Linux & Troubleshooting: Strong knowledge of Linux operating systems, with proven ability to troubleshoot and resolve complex technical issues, collaborating effectively with cross-functional teams.
● DevOps & Scripting: Proficient in scripting and automation using tools like Ansible, enabling seamless integration and automation of cluster operations. Experienced in infrastructure-as-code practices and observability tools such as Elastic. (A small scripting example follows this posting.)
● Agile & Collaboration: Strong understanding of Agile SAFe for Teams, with the ability to work effectively in Agile environments and collaborate with cross-functional teams.
● ITSM Process & Tools: Knowledgeable in ITSM processes and tools such as ServiceNow.

Other Critical Requirements:
● Automation and Scripting: Proficiency in automation tools and programming languages such as Ansible and Python to streamline operations and improve efficiency.
● Analytical and Problem-Solving Skills: Strong analytical and problem-solving abilities to address complex technical challenges in a dynamic enterprise environment.
● 24x7 Support: Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability.
● Team Management and Leadership: Proven experience managing geographically distributed and culturally diverse teams, with strong leadership, coaching, and mentoring skills.
● Communication Skills: Exceptional written and oral communication skills, with the ability to clearly articulate technical and functional issues, conclusions, and recommendations to stakeholders at all levels.
● Stakeholder Management: Prior experience in effectively managing both onshore and offshore stakeholders, ensuring alignment and collaboration across teams.
● Business Presentations: Skilled in creating and delivering impactful business presentations to communicate key insights and recommendations.
● Collaboration and Independence: Demonstrated ability to work independently as well as collaboratively within a team environment, ensuring successful project delivery in a complex enterprise setting.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East.

Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
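As referenced in the DevOps & Scripting bullet above, here is a minimal sketch of the kind of routine cluster check such a role automates: listing an HDFS directory through the WebHDFS REST API. The NameNode host and path are hypothetical placeholders, and a secured cluster would additionally need Kerberos/SPNEGO authentication, which this sketch omits.

```python
# A minimal HDFS directory listing via the WebHDFS REST API.
# Host, port, and path are hypothetical; 9870 is the default NameNode
# HTTP port in Hadoop 3.
import requests

NAMENODE = "http://namenode.example.internal:9870"
PATH = "/data/ingest"   # hypothetical directory to check

resp = requests.get(
    f"{NAMENODE}/webhdfs/v1{PATH}",
    params={"op": "LISTSTATUS"},
    timeout=15,
)
resp.raise_for_status()

statuses = resp.json()["FileStatuses"]["FileStatus"]
for entry in statuses:
    # pathSuffix is the entry name; length is the file size in bytes
    print(f"{entry['type']:<9} {entry['length']:>12}  {entry['pathSuffix']}")
```

Wrapped in a scheduler or an Ansible playbook, a check like this becomes an automated health probe that alerts when an expected ingest directory is empty or unreachable.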

Posted 19 hours ago

Apply

3.0 years

0 Lacs

Udupi, Karnataka, India

On-site


Blackfrog Technologies is a Manipal-based technology startup that manufactures medical devices. We are ISO 13485 certified and have developed patented systems for improving immunization supply chains, now delivering efficacious vaccines to some of the farthest corners of India and beyond. Join us and be a part of this exciting and fulfilling journey!

Responsibilities:
● Collaborate with cross-functional teams to define, design, and deliver new features.
● Develop responsive and dynamic front-end interfaces using Angular, ReactJS, HTML, CSS, and JavaScript.
● Build efficient and robust back-end systems with Node.js and Python.
● Design, query, and manage databases like MySQL and MongoDB to ensure data integrity and optimal performance.
● Develop and maintain Android/Flutter applications that integrate with existing platforms and enhance user experience.
● Work with the MQTT protocol to enable seamless communication between devices in IoT ecosystems (see the sketch after this posting).
● Manage and deploy applications using AWS cloud services for high availability and scalability.

Must Have:
● 3+ years of strong software development background building complex applications.
● Proficiency with fundamental front-end technologies such as Angular, ReactJS, HTML, CSS, JavaScript, and Electron.js.
● Proficiency with server-side languages such as Node.js and Python.
● Experience with database technologies such as MySQL and MongoDB.
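As referenced in the MQTT responsibility above, here is a minimal subscriber sketch using paho-mqtt. The broker address and topic are hypothetical placeholders; the callbacks follow the paho-mqtt 1.x style (paho-mqtt 2.x additionally requires a CallbackAPIVersion argument to Client()).

```python
# A minimal MQTT subscriber for device telemetry; all names are placeholders.
import paho.mqtt.client as mqtt

BROKER = "broker.example.internal"   # hypothetical broker host
TOPIC = "devices/+/telemetry"        # '+' matches any single device ID

def on_connect(client, userdata, flags, rc):
    print(f"connected with result code {rc}")
    client.subscribe(TOPIC, qos=1)   # QoS 1: at-least-once delivery

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)  # 1883 is the default MQTT port
client.loop_forever()                       # block and dispatch callbacks
```

Subscribing inside on_connect (rather than once at startup) means the subscription is re-established automatically whenever the client reconnects, which matters for devices on unreliable field networks such as vaccine cold-chain monitors.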

Posted 19 hours ago

Apply