5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Data Build Tool (dbt), Python (Programming Language)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education
Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on data solutions and collaborating with teams to optimize data processes.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Develop and maintain data pipelines
- Ensure data quality and integrity
- Implement ETL processes
Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse
- Good-to-have skills: Experience with Data Build Tool (dbt)
- Strong understanding of data architecture
- Proficiency in SQL and database management
- Experience with cloud data platforms
- Knowledge of data modeling
Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse
- This position is based at our Bengaluru office
- A 15-year full-time education is required
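As a rough illustration of the day-to-day ETL work this posting describes, here is a minimal sketch of an incremental load into Snowflake using the snowflake-connector-python package. All database, schema, table, warehouse, and credential names are invented placeholders, not details from the posting.

```python
# Minimal sketch of an incremental Snowflake load; assumes the
# snowflake-connector-python package. All names below are placeholders.
import os

import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.public.orders AS tgt
USING analytics.staging.stg_orders AS src
    ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
    tgt.status = src.status,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN
    INSERT (order_id, status, updated_at)
    VALUES (src.order_id, src.status, src.updated_at)
"""

def run_incremental_load() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="ANALYTICS",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(MERGE_SQL)  # upsert staged rows into the target table
            print(f"Rows affected: {cur.rowcount}")
    finally:
        conn.close()

if __name__ == "__main__":
    run_incremental_load()
```

In practice a MERGE step like this would sit behind an orchestrator and data-quality checks, in line with the responsibilities listed above.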
Posted 1 month ago
3.0 - 5.0 years
5 - 12 Lacs
Pune
Remote
We are hiring Analytics Engineers / Data Modelers to join our growing data team in Pune. Requirements: hands-on experience with at least one of Honeydew, CubeDev, or dbt Metrics, plus strong SQL skills.
Posted 1 month ago
2.0 - 7.0 years
4 - 8 Lacs
Ahmedabad
Work from Office
Travel Designer Group
Founded in 1999, Travel Designer Group has consistently achieved remarkable milestones in a relatively short span of time. While we embody the agility, growth mindset, and entrepreneurial energy typical of start-ups, we bring with us over 24 years of deep-rooted expertise in the travel trade industry. As a leading global travel wholesaler, we serve as a vital bridge connecting hotels, travel service providers, and an expansive network of travel agents worldwide. Our core strength lies in sourcing, curating, and distributing high-quality travel inventory through our award-winning B2B reservation platform, RezLive.com. This enables travel trade professionals to access real-time availability and competitive pricing to meet the diverse needs of travelers globally. Our expanding portfolio includes innovative products such as:
* Rez.Tez
* Affiliate.Travel
* Designer Voyages
* Designer Indya
* RezRewards
* RezVault
With a presence in over 32 countries and a growing team of 300+ professionals, we continue to redefine travel distribution through technology, innovation, and a partner-first approach.
Website: https://www.traveldesignergroup.com/
Profile: ETL Developer
ETL tools (any one): Talend / Apache NiFi / Pentaho / AWS Glue / Azure Data Factory / Google Dataflow
Workflow & orchestration (any one, good to have, not mandatory): Apache Airflow / dbt (Data Build Tool) / Luigi / Dagster / Prefect / Control-M
Programming & scripting: SQL (advanced), Python (mandatory), Bash/Shell (mandatory), Java or Scala (optional, for Spark)
Databases & data warehousing: MySQL / PostgreSQL / SQL Server / Oracle (mandatory); Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, MongoDB / Cassandra (good to have)
Cloud & data storage (any one or two): AWS S3 / Azure Blob Storage / Google Cloud Storage (mandatory); Kafka / Kinesis / Pub/Sub
Interested candidates can also share their resume at shivani.p@rezlive.com
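To make the stack above concrete, here is one plausible shape of such an ETL job in Python: pull a CSV from S3 with boto3, apply a simple pandas transform, and append it to a PostgreSQL staging table via SQLAlchemy. The bucket, key, table, column names, and connection string are all invented for illustration.

```python
# Illustrative extract-transform-load flow: S3 CSV -> pandas -> PostgreSQL.
# Assumes boto3, pandas, and SQLAlchemy (with psycopg2) are installed; the
# bucket, key, table, and DSN are placeholders.
import io

import boto3
import pandas as pd
from sqlalchemy import create_engine

BUCKET, KEY = "example-raw-bucket", "bookings/2024-01-01.csv"
DSN = "postgresql+psycopg2://etl_user:secret@db-host:5432/warehouse"

def extract() -> pd.DataFrame:
    obj = boto3.client("s3").get_object(Bucket=BUCKET, Key=KEY)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["booking_id"])      # drop rows without a key
    df["amount"] = df["amount"].astype(float)  # normalize types
    return df

def load(df: pd.DataFrame) -> None:
    engine = create_engine(DSN)
    # Append into a staging table; dedup/merge would then happen in SQL.
    df.to_sql("stg_bookings", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract()))
```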
Posted 1 month ago
2.0 - 4.0 years
13 - 14 Lacs
Chennai
Work from Office
Responsible for working cross-functionally to collect data and develop models to determine trends utilizing a variety of data sources. Retrieves, analyzes and summarizes business, operations, employee, customer and/or economic data in order to develop business intelligence, optimize effectiveness, predict business outcomes and support decision-making. Involved with numerous key business decisions by conducting the analyses that inform our business strategy. This may include: impact measurement of new products or features via normalization techniques, optimization of business processes through robust A/B testing, clustering or segmentation of customers to identify opportunities for differentiated treatment, deep-dive analyses to understand drivers of key business trends, identification of customer sentiment drivers through natural language processing (NLP) of verbatim responses to Net Promoter System (NPS) surveys, and development of frameworks to drive upsell strategy for existing customers by balancing business priorities with customer activity. Works with moderate guidance in own area of knowledge.
Job Description
1. 2-4 years of professional experience in software or data engineering roles.
2. Hands-on experience with Power BI, Power BI Desktop, Power Apps, and Power Automate.
3. Proficiency with Tableau and SharePoint.
4. Familiarity with Amazon Redshift and with SAP integration and data extraction.
5. Strong analytical, troubleshooting, and communication skills.
We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the benefits summary on our careers site for more details.
Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.
Certifications (if applicable)
Relevant Work Experience: 2-5 Years
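The posting calls out "robust A/B testing" among the analyses. As a toy illustration only (not the company's methodology), a two-sample significance check in Python might look like this, with the conversion data fabricated:

```python
# Toy A/B significance check with SciPy; the 0/1 conversion arrays are
# synthetic, generated purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.binomial(1, 0.10, size=5000)  # 10% baseline conversion
variant = rng.binomial(1, 0.12, size=5000)  # 12% conversion under treatment

# Large-sample comparison of two proportions via an independent-samples
# t-test on the 0/1 outcomes (a z-test or chi-squared test also works).
t_stat, p_value = stats.ttest_ind(variant, control)
print(f"lift = {variant.mean() - control.mean():.4f}, p = {p_value:.4g}")
```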
Posted 1 month ago
3.0 - 8.0 years
6 - 12 Lacs
Mumbai
Work from Office
Shift: (GMT+05:30) Asia/Kolkata (IST)
What do you need for this opportunity? Must-have skills required: Machine Learning, NumPy, Data Cleaning, Python, Model Evaluation, pandas, Statistics
We are seeking a talented Data Scientist II to join our team. The ideal candidate will have 2-5 years of experience in data science and possess expertise in machine learning, deep learning, Python programming, SQL, Amazon Redshift, NLP, and AWS Cloud.
Duties and Responsibilities:
- Develop and implement machine learning models to extract insights from large datasets.
- Utilize deep learning techniques to enhance data analysis and predictive modeling.
- Write efficient Python code to manipulate and analyze data.
- Work with SQL databases to extract and transform data for analysis.
- Utilize Amazon Redshift for data warehousing and analytics.
- Apply NLP techniques to extract valuable information from unstructured data.
- Utilize AWS Cloud services for data storage, processing, and analysis.
Qualifications and Requirements:
- Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
- 3-5 years of experience in data science or a related field.
- Proficiency in machine learning, deep learning, Python programming, SQL, Amazon Redshift, NLP, and AWS Cloud.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Key Competencies:
- Strong analytical skills.
- Problem-solving abilities.
- Proficiency in machine learning and deep learning techniques.
- Excellent programming skills in Python.
- Knowledge of SQL and database management.
- Familiarity with Amazon Redshift, NLP, and AWS Cloud services.
Performance Expectations:
- Develop and deploy advanced machine learning models.
- Extract valuable insights from complex datasets.
- Collaborate with cross-functional teams to drive data-driven decision-making.
- Stay updated on the latest trends and technologies in data science.
We are looking for a motivated and skilled Data Scientist II to join our team and contribute to our data-driven initiatives. If you meet the qualifications and are passionate about data science, we encourage you to apply.
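Given the emphasis on model development and evaluation, the following is a minimal sketch of the train/evaluate loop alluded to above, using scikit-learn on a synthetic dataset. It makes no claim about the team's actual stack; every dataset and parameter here is illustrative.

```python
# Minimal model-evaluation sketch with scikit-learn on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for a real dataset.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
proba = model.predict_proba(X_test)[:, 1]

# Report per-class precision/recall plus a threshold-free ranking metric.
print(classification_report(y_test, model.predict(X_test)))
print(f"ROC AUC: {roc_auc_score(y_test, proba):.3f}")
```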
Posted 1 month ago
6.0 - 9.0 years
14 - 16 Lacs
Chennai
Remote
SQL Engineer
- Expertise in SQL, including stored procedures
- Experience with Amazon Redshift
- Strong understanding of data warehousing concepts
- Minimum 6 years of experience
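For context on the stored-procedure requirement: Redshift procedures are invoked with CALL, and since Redshift speaks a PostgreSQL-compatible protocol, a client such as psycopg2 can drive them. The procedure name, arguments, and connection details below are placeholders.

```python
# Hypothetical sketch: invoke a Redshift stored procedure from Python via
# psycopg2 (Redshift exposes a PostgreSQL-compatible endpoint). The host,
# credentials, and procedure are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="secret",
)
try:
    with conn.cursor() as cur:
        # Run a (hypothetical) procedure that rebuilds a daily aggregate.
        cur.execute("CALL refresh_daily_sales(%s);", ("2024-01-01",))
    conn.commit()
finally:
    conn.close()
```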
Posted 1 month ago
8.0 - 12.0 years
13 - 15 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Role Description
We are seeking a skilled and proactive Database Administrator (DBA) with strong SQL development expertise to manage, optimize, and support our database systems. The ideal candidate will have hands-on experience with cloud-based and on-premises database platforms, with a strong emphasis on AWS RDS, PostgreSQL, Redshift, and SQL Server. A background in developing and optimizing complex SQL queries, stored procedures, and data workflows is essential.
Experience: 8+ years
Key Responsibilities:
- Design, implement, and maintain high-performance, scalable, and secure database systems on AWS RDS, PostgreSQL, Redshift, and SQL Server.
- Develop, review, and optimize complex SQL queries, views, stored procedures, triggers, and functions.
- Monitor database performance, implement tuning improvements, and ensure high availability and disaster recovery strategies.
- Collaborate with development and DevOps teams to support application requirements, schema changes, and release cycles.
- Perform database migrations, upgrades, and patch management.
- Create and maintain documentation related to database architecture, procedures, and best practices.
- Implement and maintain data security measures and access controls.
- Support ETL processes and troubleshoot data pipeline issues as needed.
Mandatory Skills & Experience:
Strong hands-on experience with AWS RDS, PostgreSQL, Amazon Redshift, and Microsoft SQL Server.
Skills Required:
- Proficiency in SQL development, including performance tuning and query optimization.
- Experience with backup strategies, replication, monitoring, and high-availability database configurations.
- Solid understanding of database design principles and best practices.
- Knowledge of SSIS, SSRS & SSAS development and management.
- Knowledge of database partitioning, compression, and online performance monitoring/tuning.
- Experience in the database release management process and script review.
- Knowledge of database mirroring, Always On Availability Groups (AAG), and disaster recovery procedures.
- Knowledge of database monitoring and different monitoring tools.
- Knowledge of data modeling, database optimization, and relational database schemas.
- Ability to write complex SQL queries and debug someone else's code.
- Experience in managing internal and external MS SQL database security.
- Knowledge of database policies, certificates, Database Mail, and resource management.
- Knowledge of SQL Server internals (memory usage, DMVs, threads, wait stats, Query Store, SQL Profiler).
- Knowledge of cluster server management and failovers.
- Knowledge of data modeling (SSAS) and reporting services (SSRS, Tableau, Power BI, Athena).
Good-to-have Skills:
- Exposure to MySQL and Oracle databases.
- Familiarity with Azure Database Services (e.g., Azure SQL, Azure Database for PostgreSQL).
- Knowledge of scripting languages (e.g., Python, Bash) for automation is a plus.
- Experience working in Agile/Scrum environments.
- Knowledge of Java, PowerShell, or Python is preferred.
Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
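Much of the tuning work described above starts with reading query plans. As a small, generic illustration (the DSN, table, and query are invented), one way to capture a PostgreSQL execution plan from Python is:

```python
# Generic sketch: pull an execution plan from PostgreSQL to guide tuning.
# The connection string and query are placeholders.
import psycopg2

QUERY = """
SELECT customer_id, SUM(amount)
FROM orders
WHERE order_date >= %s
GROUP BY customer_id
"""

with psycopg2.connect("dbname=warehouse user=dba_user") as conn:
    with conn.cursor() as cur:
        # ANALYZE executes the query and reports actual timings and rows;
        # BUFFERS adds shared-buffer hit/read counts per plan node.
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + QUERY, ("2024-01-01",))
        for (line,) in cur.fetchall():  # each plan line comes back as a row
            print(line)
```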
Posted 1 month ago
1.0 - 5.0 years
4 - 8 Lacs
Mumbai
Work from Office
Piscis Networks is looking for a TAC Support Engineer to join our dynamic team and embark on a rewarding career journey.
- Responding to customer inquiries and resolving technical issues via phone, email, or chat
- Conducting diagnostic tests to identify the root cause of customer issues
- Providing technical guidance to customers and walking them through solutions to resolve their problems
- Collaborating with development teams to escalate and resolve complex technical issues
- Maintaining accurate records of customer interactions and issue resolutions in a CRM system
- Participating in the development and delivery of customer training and support materials
- Communicating with customers and internal stakeholders to provide status updates on issue resolution
Requirements: a strong technical background and understanding of hardware and software systems, excellent communication and interpersonal skills, and experience with CRM and ticketing systems.
Posted 1 month ago
7.0 - 12.0 years
15 - 30 Lacs
Hyderabad
Hybrid
Job Title: Lead Data Engineer
Job Summary
The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, and will work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions supporting a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives through mentoring and coaching. It provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business-unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL) to address business and environmental challenges. It also works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices, is responsible for repeatable, lean and maintainable enterprise BI design across organizations, and partners effectively with the client team. We expect leadership not only in the conventional sense but within the team itself: candidates should exhibit qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.
Responsibilities:
- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc.
- Create functional and technical documentation, e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
- Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models based on those needs.
- Perform data analysis to validate data models and confirm the ability to meet business needs.
- Serve as project or DI lead as needed, overseeing multiple consultants from various competencies.
- Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data integration.
- Ensure proper execution/creation of methodology, training, templates, resource plans and engagement review processes.
- Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
- Coordinate and consult with the project manager, client business staff, client technical staff and project developers on data architecture best practices and anything else data-related at the project or business-unit level.
- Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik.
- Work with the report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.
Required Qualifications:
- 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc.
- 5-8 years of management experience required; 5-8 years of consulting experience preferred.
- Minimum of 5 years of data architecture, data modelling or similar experience.
- Bachelor's degree or equivalent experience; Master's degree preferred.
- Strong data warehousing, OLTP systems, data integration and SDLC background.
- Strong experience in orchestration, with working experience in cloud-native / third-party ETL data load orchestration (e.g. Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar).
- Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.).
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud and Big Data.
- Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP).
- Strong experience with Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA or similar, with CI/CD experience using one or more code management platforms.
- Strong Databricks experience; must be able to create notebooks in PySpark (see the sketch after this posting).
- Experience using major data modelling tools (e.g. ERwin, ER/Studio, PowerDesigner).
- Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift).
- 3-5 years of development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.
Preferred Skills & Experience:
- Knowledge of and working experience with data integration processes, such as data warehousing, EAI, etc.
- Experience providing estimates for data integration projects, including testing, documentation, and implementation.
- Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate and recommend alternative solutions.
- Ability to provide technical direction to other team members, including contractors and employees.
- Ability to contribute to conceptual data modelling sessions to accurately define business processes independently of data structures, and then combine the two.
- Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results.
- Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM.
- Can create documentation and presentations that stand on their own.
- Can advise sales on the evaluation of data integration efforts for new or existing client work.
- Can contribute to internal/external data integration proofs of concept.
- Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered.
- Ability to work independently on projects as well as collaborate effectively across teams.
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success.
- Strong team-building, interpersonal, analytical, problem identification and resolution skills.
- Experience working with multi-level business communities.
- Can effectively utilise SQL and/or an available BI tool to validate and elaborate business rules.
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues.
- Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data.
- Demonstrates a complete understanding of and utilises DSC methodology documents to efficiently complete assigned roles and associated tasks.
- Deals effectively with all team members and builds strong working relationships and rapport with them.
- Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution.
- Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint.
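Since the posting specifically requires PySpark notebooks on Databricks, here is a hedged, notebook-style sketch of a batch transformation. The paths, columns, and aggregation are invented; on Databricks a `spark` session already exists, so the builder line is only needed when running elsewhere.

```python
# Hedged PySpark sketch of a batch transformation, notebook-style.
# Paths, schema, and business logic are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue").getOrCreate()

orders = spark.read.parquet("s3://example-lake/raw/orders/")

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("customer_id").alias("buyers"),
    )
)

# Write the curated aggregate back to the lake for BI consumption.
daily_revenue.write.mode("overwrite").parquet(
    "s3://example-lake/curated/daily_revenue/"
)
```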
Posted 1 month ago
8.0 - 13.0 years
18 - 27 Lacs
Bengaluru
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management and data warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives, to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities.
What You'll Do
- Define data retention policies
- Monitor performance and advise on any necessary infrastructure changes
- Mentor junior engineers and work with other architects to deliver best-in-class solutions
- Implement ETL / ELT processes and orchestration of data flows
- Recommend and drive adoption of newer tools and techniques from the big data ecosystem
Expertise You'll Bring
- 10+ years in industry, building and managing big data systems
- Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
- Building stream-processing systems, using solutions such as Storm or Spark Streaming
- Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
- Reporting solutions like Pentaho, Power BI, Looker, including customizations
- Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
- Working with SaaS-based data management products will be an added advantage
- Proficiency and expertise in Cloudera / Hortonworks, Spark, HDF and NiFi
- RDBMS and NoSQL like Vertica, Redshift; data modelling with physical design and SQL performance optimization
- Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
- Big Data technology like Hadoop, Spark, and NoSQL-based data-warehousing solutions
- Data warehousing and reporting including customization; Hadoop, Spark, Kafka, core Java, Spring/IoC, design patterns
- Big Data querying tools, such as Pig, Hive, and Impala
- Open-source technologies and databases (SQL & NoSQL)
- Proficient understanding of distributed computing principles
- Ability to solve any ongoing issues with operating the cluster
- Scaling data pipelines using open-source components and AWS services
- Cloud (AWS) provisioning, capacity planning and performance analysis at various levels
- Web-based SOA architecture implementation with design pattern experience will be an added advantage
Benefits
- Competitive salary and benefits package
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
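Stream processing with Spark appears repeatedly in the expertise list above. A minimal Spark Structured Streaming sketch reading from Kafka and landing parquet follows; broker addresses, the topic, and paths are placeholders, and the Kafka source additionally requires the spark-sql-kafka connector package on the classpath.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> parquet sink.
# Brokers, topic, and S3 paths are placeholders; requires the
# spark-sql-kafka connector package.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "clickstream")
    .load()
    # Kafka delivers bytes; cast the payload to string for downstream parsing.
    .select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-lake/streams/clickstream/")
    .option("checkpointLocation", "s3://example-lake/checkpoints/clickstream/")
    .start()
)
query.awaitTermination()
```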
Posted 1 month ago
8.0 - 13.0 years
18 - 30 Lacs
Pune
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management and data warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives, to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities.
What You'll Do
- Define data retention policies
- Monitor performance and advise on any necessary infrastructure changes
- Mentor junior engineers and work with other architects to deliver best-in-class solutions
- Implement ETL / ELT processes and orchestration of data flows
- Recommend and drive adoption of newer tools and techniques from the big data ecosystem
Expertise You'll Bring
- 10+ years in industry, building and managing big data systems
- Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
- Building stream-processing systems, using solutions such as Storm or Spark Streaming
- Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
- Reporting solutions like Pentaho, Power BI, Looker, including customizations
- Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
- Working with SaaS-based data management products will be an added advantage
- Proficiency and expertise in Cloudera / Hortonworks, Spark, HDF and NiFi
- RDBMS and NoSQL like Vertica, Redshift; data modelling with physical design and SQL performance optimization
- Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
- Big Data technology like Hadoop, Spark, and NoSQL-based data-warehousing solutions
- Data warehousing and reporting including customization; Hadoop, Spark, Kafka, core Java, Spring/IoC, design patterns
- Big Data querying tools, such as Pig, Hive, and Impala
- Open-source technologies and databases (SQL & NoSQL)
- Proficient understanding of distributed computing principles
- Ability to solve any ongoing issues with operating the cluster
- Scaling data pipelines using open-source components and AWS services
- Cloud (AWS) provisioning, capacity planning and performance analysis at various levels
- Web-based SOA architecture implementation with design pattern experience will be an added advantage
Benefits
- Competitive salary and benefits package
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
Posted 1 month ago
8.0 - 13.0 years
18 - 25 Lacs
Hyderabad
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management and data warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives, to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities.
What You'll Do
- Define data retention policies
- Monitor performance and advise on any necessary infrastructure changes
- Mentor junior engineers and work with other architects to deliver best-in-class solutions
- Implement ETL / ELT processes and orchestration of data flows
- Recommend and drive adoption of newer tools and techniques from the big data ecosystem
Expertise You'll Bring
- 10+ years in industry, building and managing big data systems
- Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
- Building stream-processing systems, using solutions such as Storm or Spark Streaming
- Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
- Reporting solutions like Pentaho, Power BI, Looker, including customizations
- Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
- Working with SaaS-based data management products will be an added advantage
- Proficiency and expertise in Cloudera / Hortonworks, Spark, HDF and NiFi
- RDBMS and NoSQL like Vertica, Redshift; data modelling with physical design and SQL performance optimization
- Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
- Big Data technology like Hadoop, Spark, and NoSQL-based data-warehousing solutions
- Data warehousing and reporting including customization; Hadoop, Spark, Kafka, core Java, Spring/IoC, design patterns
- Big Data querying tools, such as Pig, Hive, and Impala
- Open-source technologies and databases (SQL & NoSQL)
- Proficient understanding of distributed computing principles
- Ability to solve any ongoing issues with operating the cluster
- Scaling data pipelines using open-source components and AWS services
- Cloud (AWS) provisioning, capacity planning and performance analysis at various levels
- Web-based SOA architecture implementation with design pattern experience will be an added advantage
Benefits
- Competitive salary and benefits package
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
Posted 1 month ago
12.0 - 15.0 years
16 - 18 Lacs
Bengaluru
Hybrid
iSource Services is hiring for one of their clients for an AWS position. The role requires AWS experience (not Azure or GCP), 12-15 years of overall experience, and hands-on expertise in design and implementation. You will design and develop data solutions, including efficient data processing pipelines using AWS services like AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift. Candidates should possess exceptional communication skills to engage effectively with US clients. The ideal candidate must be hands-on, with significant practical experience. Availability to work overlapping US hours is essential. The contract duration is 6 months.
Skills: AWS experience, communication skills
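As one concrete, purely hypothetical illustration of the Glue/Lambda/S3/Redshift combination named above, a Lambda handler could kick off a Redshift COPY through the Redshift Data API whenever a file lands in S3. The cluster, database, table, and IAM role values are placeholders.

```python
# Hypothetical AWS Lambda handler: on an S3 object-created event, issue a
# Redshift COPY via the Redshift Data API (boto3 "redshift-data" client).
# Cluster, database, table, and role ARN are placeholders.
import boto3

client = boto3.client("redshift-data")

def lambda_handler(event, context):
    record = event["Records"][0]["s3"]
    source = f"s3://{record['bucket']['name']}/{record['object']['key']}"
    copy_sql = (
        "COPY staging.events "
        f"FROM '{source}' "
        "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role' "
        "FORMAT AS PARQUET;"
    )
    # The Data API runs the statement asynchronously and returns an id
    # that can be polled with describe_statement.
    resp = client.execute_statement(
        ClusterIdentifier="example-cluster",
        Database="analytics",
        DbUser="etl_user",
        Sql=copy_sql,
    )
    return {"statement_id": resp["Id"]}
```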
Posted 1 month ago
2.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Flexing It is a freelance consulting marketplace that connects freelancers and independent consultants with organisations seeking independent talent. Flexing It has partnered with our client, a global leader in energy management and automation, which is seeking a Data Engineer to prepare data and make it available in an efficient and optimized format for their different data consumers, ranging from BI and analytics to data science applications. The work involves current technologies, in particular Apache Spark, Lambda & Step Functions, Glue Data Catalog, and Redshift in an AWS environment.
Key Responsibilities:
- Design and develop new data ingestion patterns into the IntelDS Raw and/or Unified data layers, based on the requirements and needs for connecting new data sources or building new data objects. Working in ingestion patterns allows the data pipelines to be automated.
- Participate in and apply DevSecOps practices by automating the integration and delivery of data pipelines in a cloud environment. This can include the design and implementation of end-to-end data integration tests and/or CI/CD pipelines.
- Analyze existing data models, and identify and implement performance optimizations for data ingestion and data consumption. The objective is to accelerate data availability within the platform and to consumer applications.
- Support client applications in connecting to and consuming data from the platform, and ensure they follow our guidelines and best practices.
- Participate in the monitoring of the platform and the debugging of detected issues and bugs.
Skills required:
- Minimum of 3 years' prior experience as a data engineer, with proven experience in Big Data and data lakes in a cloud environment.
- Bachelor's or Master's degree in computer science or applied mathematics (or equivalent).
- Proven experience working with data pipelines / ETL / BI, regardless of the technology.
- Proven experience working with AWS, including at least 3 of: Redshift, S3, EMR, CloudFormation, DynamoDB, RDS, Lambda.
- Big Data technologies and distributed systems: one of Spark, Presto or Hive.
- Python: scripting and object-oriented programming.
- Fluency in SQL for data warehousing (Redshift in particular is a plus).
- Good understanding of data warehousing and data modelling concepts.
- Familiarity with Git, Linux and CI/CD pipelines is a plus.
- Strong systems/process orientation with demonstrated analytical thinking, organization skills and problem-solving skills.
- Ability to self-manage, prioritize and execute tasks in a demanding environment.
- Strong consultancy orientation and experience, with the ability to form collaborative, productive working relationships across diverse teams and cultures, is a must.
- Willingness and ability to train and teach others.
- Ability to facilitate meetings and follow up on resulting action items.
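The Spark-plus-Glue-Data-Catalog stack named above typically takes the shape of a Glue job script. The skeleton below is a hedged example: the catalog database, table, and output path are invented (the posting's IntelDS layer names are not public detail), and the script only runs inside the Glue job runtime.

```python
# Skeleton of an AWS Glue PySpark job: read a catalogued table, reshape it,
# and write parquet to a "unified" layer. Database, table, and paths are
# invented; this runs only inside the Glue job environment.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read through the Glue Data Catalog rather than raw S3 paths.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="meter_readings"
)

# Deduplicate on the business key before promoting to the unified layer.
df = dyf.toDF().dropDuplicates(["reading_id"])

df.write.mode("overwrite").parquet("s3://example-lake/unified/meter_readings/")
job.commit()
```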
Posted 1 month ago
1.0 - 3.0 years
1 - 5 Lacs
Chennai
Work from Office
hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role.
Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.
Job Summary
We are looking for an experienced and proactive ETL Lead to oversee and guide our ETL testing and data validation efforts. This role requires a deep understanding of ETL processes, strong technical expertise in tools such as SQL, Oracle, MongoDB, AWS, and Python/PySpark, and proven leadership capabilities. The ETL Lead will be responsible for ensuring the quality, accuracy, and performance of our data pipelines while mentoring a team of testers and collaborating with cross-functional stakeholders.
Job Description
Key Responsibilities:
- Lead the planning, design, and execution of ETL testing strategies across multiple projects.
- Oversee the development and maintenance of test plans, test cases, and test data for ETL processes.
- Ensure data integrity, consistency, and accuracy across all data sources and destinations.
- Collaborate with data engineers, developers, business analysts, and project managers to define ETL requirements and testing scope.
- Mentor and guide a team of ETL testers, providing technical direction and support.
- Review and approve test deliverables and ensure adherence to best practices and quality standards.
- Identify and resolve complex data issues, bottlenecks, and performance challenges.
- Drive continuous improvement in ETL testing processes, tools, and methodologies.
- Provide regular status updates, test metrics, and risk assessments to stakeholders.
- Stay current with emerging trends and technologies in data engineering and ETL testing.
Requirements
- 6+ years of experience in ETL testing, with at least 2 years in a lead or senior role.
- Strong expertise in ETL concepts, data warehousing, and data validation techniques.
- Hands-on experience with Oracle, MongoDB, AWS services (e.g., S3, Redshift, Glue), and Python/PySpark scripting.
- Advanced proficiency in SQL and other query languages.
- Proven ability to lead and mentor a team of testers.
- Excellent problem-solving, analytical, and debugging skills.
- Strong communication and stakeholder management abilities.
- Experience with Agile/Scrum methodologies is a plus.
- Ability to manage multiple priorities and deliver high-quality results under tight deadlines.
Disclaimer
This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.
Comcast is proud to be an equal opportunity workplace.
We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.
Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life.
Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.
Relevant Work Experience: 7-10 Years
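ETL test design of the kind this role describes often reduces to programmatic source-versus-target checks. A small hedged sketch in pytest style follows; the connection strings, tables, and queries are placeholders, not the team's actual setup.

```python
# Hedged sketch of source-vs-target reconciliation checks for ETL testing,
# written as pytest-style functions. DSNs, tables, and queries are
# placeholders; assumes pandas and SQLAlchemy with suitable drivers.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("oracle+oracledb://user:pwd@src-host:1521/?service_name=SRC")
target = create_engine("postgresql+psycopg2://user:pwd@tgt-host:5432/dwh")

def scalar(engine, sql: str) -> int:
    """Run a single-value query and return the result as an int."""
    return int(pd.read_sql(sql, engine).iloc[0, 0])

def test_row_counts_match():
    # The load should move every source row into the staging table.
    assert scalar(source, "SELECT COUNT(*) FROM orders") == scalar(
        target, "SELECT COUNT(*) FROM stg_orders"
    )

def test_no_null_business_keys():
    # A common post-load data-quality gate: no row may lose its key in flight.
    nulls = scalar(
        target, "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL"
    )
    assert nulls == 0, f"{nulls} rows arrived without an order_id"
```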
Posted 1 month ago
5.0 - 10.0 years
12 - 17 Lacs
Pune
Work from Office
What You'll Do
The Global Analytics and Insights (GAI) team is seeking an experienced Data Visualization Manager to lead our data-driven decision-making initiatives. The ideal candidate will have a strong background in Power BI, expert-level SQL proficiency to drive actionable insights, demonstrated leadership and mentoring experience, and an ability to drive innovation and manage complex projects. You will become an expert in Avalara's financial, marketing, sales, and operations data. This position reports to a Senior Manager.
What Your Responsibilities Will Be
- Define and execute the organization's BI strategy, ensuring alignment with business goals.
- Lead, mentor, and manage a team of BI developers and analysts, fostering continuous learning.
- Develop and implement robust data visualization and reporting solutions using Power BI.
- Optimize data models, dashboards, and reports to provide meaningful insights and support decision-making.
- Collaborate with business leaders, analysts, and cross-functional teams to gather and translate requirements into actionable BI solutions.
- Be a trusted advisor to business teams, identifying opportunities where BI can drive efficiencies and improvements.
- Ensure data accuracy, consistency, and integrity across multiple data sources.
- Stay updated with the latest advancements in BI tools, SQL performance tuning, and data visualization best practices.
- Define and enforce BI development standards, governance, and documentation best practices.
- Work closely with Data Engineering teams to define and maintain scalable data pipelines.
- Drive automation and optimization of reporting processes to improve efficiency.
What You'll Need To Be Successful
- 8+ years of experience in Business Intelligence, Data Analytics, or related fields.
- 5+ years of expert proficiency in Power BI, including DAX, Power Query, data modeling, and dashboard creation.
- 5+ years of strong SQL skills, with experience in writing complex queries, performance tuning, and working with large datasets.
- Familiarity with cloud-based BI solutions (e.g., Azure Synapse, AWS Redshift, Snowflake) is a plus.
- Understanding of ETL processes and data warehousing concepts.
- Strong problem-solving, analytical thinking, and decision-making skills.
How We'll Take Care Of You
Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance.
Inclusive culture and diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship.
Learn more about our benefits by region here: Avalara North America.
What You Need To Know About Avalara
We're Avalara. We're defining the relationship between tax and tech. We've already built an industry-leading cloud compliance platform, processing nearly 40 billion customer API calls and over 5 million tax returns a year, and this year we became a billion-dollar business. Our growth is real, and we're not slowing down until we've achieved our mission to be part of every transaction in the world.
We're bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we've designed, one that empowers our people to win. Ownership and achievement go hand in hand here. We instill passion in our people through the trust we place in them. We've been different from day one. Join us, and your career will be too.
We're An Equal Opportunity Employer
Supporting diversity and inclusion is a cornerstone of our company — we don't want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national orientation, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
Posted 1 month ago
4.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
We're Hiring: Delivery Solution Architect – Data Analytics (AWS)
Location: Remote
Level: Mid to Senior (3–5+ years experience)
Are you passionate about turning complex data challenges into elegant, scalable solutions on the AWS cloud? We're looking for a Delivery Solution Architect – Data Analytics to join our growing team and take the lead in architecting and delivering next-gen data platforms that drive real business impact.
About the Role:
As a Delivery Solution Architect, you will play a pivotal role in designing and implementing end-to-end data analytics solutions on AWS. You'll collaborate with cross-functional teams and lead a group of 5–10 consultants and engineers to bring modern data architectures to life—powering business intelligence, machine learning, and operational insights.
Key Responsibilities:
- Lead the design and delivery of data analytics solutions using AWS services (Redshift, EMR, Glue, Kinesis, etc.)
- Collaborate with project teams, clients, and sales stakeholders to craft technical proposals and solution blueprints
- Design scalable, secure, high-performance data models and ETL pipelines
- Optimize data platforms for cost-efficiency, query performance, and concurrency
- Ensure data governance, security, and compliance with best practices
- Troubleshoot technical issues and provide mentorship to engineering teams
- Stay ahead of industry trends and bring innovative solutions to the table
- Report to practice leads, contribute to documentation, and support deployment activities
Qualifications:
- 3–5 years of experience as a Solution Architect or Technical Lead in data analytics delivery
- Hands-on expertise with AWS data tools (Redshift, EMR, Glue, Kinesis, etc.)
- Proficiency in SQL and Python; strong data modeling and ETL experience
- Knowledge of Microsoft Azure Data Analytics tools is a plus
- Experience working in Agile teams and using version control (e.g., Git)
- Strong communication skills and ability to collaborate with technical and non-technical stakeholders
- AWS Certifications (Solutions Architect & Data Analytics – Specialty) are required
Preferred Skills:
- Team leadership in project delivery environments
- Familiarity with data governance, data quality, and metadata management
- Documentation, proposal writing, and client engagement skills
What's In It For You?
- Opportunity to work with advanced AWS data technologies
- Be part of a collaborative, innovation-focused team
- Shape data strategies that directly impact enterprise decision-making
- Career growth in a cloud-first, analytics-driven environment
Ready to architect the future of data? Apply now or reach out to learn more!
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Mangaluru
Work from Office
Job Summary
As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.
Technical Skills
- 3+ years of experience in the field of Data Warehousing and BI
- Experience working with Snowflake Database
- In-depth knowledge of Data Warehouse concepts
- Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools
- Experience with large or partitioned relational databases (Aurora / MySQL / DB2)
- Very strong SQL and data analysis capabilities
- Familiarity with Billing and Payment data is a plus
- Agile development (Scrum) experience
- Other preferred experience includes working with DevOps practices, SaaS, IaaS, code management (CodeCommit, git), deployment tools (CodeBuild, CodeDeploy, Jenkins, Shell scripting), and Continuous Delivery
- Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation
Responsibilities
- Be a key contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design
- Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties
- Build automated pipelines and data services to validate, catalog, aggregate and transform ingested data
- Build automated data delivery pipelines and services to integrate data from the data lake into internal and external consuming applications and services
- Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models
- Keep knowledge and skills current with the latest cloud services, features and best practices
Who We Are:
unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele.
At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision.
unifyCX is a certified minority-owned business and an EOE employer who welcomes diversity.
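Because the role blends data engineering with DevOps automation (CloudFormation is on the skills list), one illustrative and purely hypothetical task is scripting a stack deployment with boto3, as sketched below. The stack name and template file are placeholders.

```python
# Hypothetical automation snippet: create or update a CloudFormation stack
# for a data-platform component. Stack and template names are placeholders.
import boto3

cfn = boto3.client("cloudformation")

def deploy_stack(name: str, template_body: str) -> None:
    """Create the stack, or fall back to an update if it already exists."""
    try:
        cfn.create_stack(
            StackName=name,
            TemplateBody=template_body,
            Capabilities=["CAPABILITY_NAMED_IAM"],
        )
        waiter = "stack_create_complete"
    except cfn.exceptions.AlreadyExistsException:
        cfn.update_stack(
            StackName=name,
            TemplateBody=template_body,
            Capabilities=["CAPABILITY_NAMED_IAM"],
        )
        waiter = "stack_update_complete"
    cfn.get_waiter(waiter).wait(StackName=name)  # block until CFN settles
    print(f"{name} deployed")

if __name__ == "__main__":
    with open("ingest-pipeline.yaml") as fh:  # hypothetical template file
        deploy_stack("data-lake-ingest", fh.read())
```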
Posted 1 month ago
3.0 - 7.0 years
8 - 12 Lacs
Mangaluru
Work from Office
Job Summary
As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.
Technical Skills
- 3+ years of experience in the field of Data Warehousing and BI
- Experience working with Snowflake Database
- In-depth knowledge of Data Warehouse concepts
- Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools
- Experience with large or partitioned relational databases (Aurora / MySQL / DB2)
- Very strong SQL and data analysis capabilities
- Familiarity with Billing and Payment data is a plus
- Agile development (Scrum) experience
- Other preferred experience includes working with DevOps practices, SaaS, IaaS, code management (CodeCommit, git), deployment tools (CodeBuild, CodeDeploy, Jenkins, Shell scripting), and Continuous Delivery
- Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation
Responsibilities
- Be a key contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design
- Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties
- Build automated pipelines and data services to validate, catalog, aggregate and transform ingested data
- Build automated data delivery pipelines and services to integrate data from the data lake into internal and external consuming applications and services
- Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models
- Keep knowledge and skills current with the latest cloud services, features and best practices
Who We Are:
unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele.
At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision.
unifyCX is a certified minority-owned business and an EOE employer who welcomes diversity.
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Experience: 3+ years. As a Senior Data Engineer, you'll build robust data pipelines and enable data-driven decisions by developing scalable solutions for analytics and reporting. Perfect for someone with strong database and ETL expertise. Job Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes. Work with large data sets from diverse sources. Develop and optimize data models, warehouses, and integrations. Collaborate with data scientists, analysts, and product teams. Ensure data quality, security, and compliance standards. Qualifications: Proficiency in SQL, Python, and data pipeline tools (Airflow, Spark). Experience with data warehouses (Redshift, Snowflake, BigQuery). Knowledge of cloud platforms (AWS/GCP/Azure). Strong problem-solving and analytical skills.
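The pipeline-tools requirement above (Airflow) typically translates into DAG definitions like the following. A hedged sketch with invented task logic and a hypothetical dag_id, not a pipeline from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Pull raw records from a source system (stubbed for illustration).
    return [{"id": 1, "amount": 250}, {"id": 2, "amount": 980}]

def transform(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    # Example transformation: keep only high-value rows.
    return [r for r in rows if r["amount"] > 500]

def load(ti, **context):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"would load {len(rows)} rows into the warehouse")

with DAG(
    dag_id="daily_sales_etl",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)

    extract_t >> transform_t >> load_t
```

Passing intermediate results through XCom keeps the example self-contained; production DAGs usually hand off via the warehouse or object storage instead.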
Posted 1 month ago
1.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Scripbox is India’s largest and best-established digital wealth management service that helps its customers create wealth for their long- and short-term goals. Founded in 2012, Scripbox is a pioneer in the digital financial services category and is recognised for creating simple and elegant user experiences in a complex domain. We do this by simplifying complex investing concepts and automating best practices, so our customers can grow their wealth without worry. We achieve this by combining cutting-edge technology, data-driven algorithms, awesome UX, and friendly customer support. Our task is ambitious and we like to work hard as well as smart. We want to build a team that relishes challenges and contributes to a new way of thinking and investing in India. We are also invested in the growth of our colleagues and providing a supportive and thriving working environment for everyone. We have been recognised by Great Place To Work® as one of India’s best companies to work for. We are looking for creators who can build products that our customers love. The challenge for you will involve understanding, and building for, an unforgiving consumer who invests a lot of trust into the product YOU will build. Your product will be used by thousands. Scripbox is making a difference in the world of personal finance and investing and we would like you to be part of the team that makes it happen. Responsibilities: Develop high-quality code using established language best practices. Collaborate closely within a team environment. Utilize the latest tools and techniques to build robust software. Actively participate in design reviews, code development, code reviews, and unit testing. Take ownership of the quality and usability of your code contributions. Requirements: 4-5 years of experience building good-quality production software. Excellent knowledge of at least one ecosystem based on Ruby, Elixir, Java, or Python. Proficiency in object-oriented programming, including a solid understanding of design patterns. Experience with functional programming is preferred but not required. Familiarity with datastores like MySQL, PostgreSQL, Redis, Redshift, etc. Familiarity with react.js/react-native, vue.js, bootstrap, etc. Knowledge of deploying software to AWS, GCP, or Azure. Knowledge of software best practices, like Test-Driven Development (TDD) and Continuous Integration (CI). We Value: Entrepreneurial spirit. Everywhere you go, you can’t help but mobilize people, build things, solve problems, roll up your sleeves, go above and beyond, raise the bar. You are an insatiable doer and driver. Strong execution and organization. Your team will be working with engineers and product leads at the bleeding edge of the development cycle. To be successful in this role, you should be comfortable executing with little oversight and be able to adapt to problems quickly. Strategic mindset: you’re comfortable thinking a few steps ahead of where the team is at now. What You’ll Get: Very competitive salary with performance bonus. Active promotion of your professional career by sending you to events, hackathons, user groups, etc. A weekly time slot where you are encouraged to play around with new technology or pursue self-learning. Skills: Ruby on Rails, Elixir, Backend, Ruby.
Posted 1 month ago
5.0 - 9.0 years
8 - 13 Lacs
Chennai
Work from Office
Career Area: Technology, Digital and Data. Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here; we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it. JOB DUTIES: As a Software Engineer you will contribute to the development and deployment of Caterpillar's state-of-the-art digital platform. Competent to perform all programming and development assignments without close supervision; normally assigned the more complex aspects of systems work. Works directly on complex application/technical problem identification and resolution, including responding to off-shift and weekend support calls. Works independently on complex systems or infrastructure components that may be used by one or more applications or systems. Drives application development focused on delivering business-valuable features. Maintains high standards of software quality within the team by establishing good practices and habits. Identifies and encourages areas for growth and improvement within the team. Mentors junior developers. Communicates with end users and internal customers to help direct development, debugging, and testing of application software for accuracy, integrity, interoperability, and completeness. Performs integrated testing and customer acceptance testing of components that require careful planning and execution to ensure timely, quality results. The position manages the completion of its own work assignments and coordinates work with others. Based on past experience and knowledge, the incumbent normally works independently with minimal management input and review of end results. Typical customers include Caterpillar customers, dealers, other external companies who purchase services offered by Caterpillar, as well as internal business unit and/or service centre groups. The position is challenged to quickly and correctly identify problems that may not be obvious. The incumbent solves problems by determining the best course of action, within departmental guidelines, from many existing solutions. The incumbent sets priorities and establishes a work plan in order to complete broadly defined assignments and achieve desired results. The position participates in brainstorming sessions focused on developing new approaches to meeting quality goals in the measure(s) stated. Basic qualifications: A four-year degree from an accredited college or university. 7+ years of software development experience, or at least 2 years of experience with a master's degree in computer science or a related field. 7+ years of experience designing and developing data pipelines in Python. Top candidates will also have proven experience in many of the following: Designing, developing, deploying, and maintaining software at scale, with a good understanding of concurrency. In-depth understanding of data-processing pipelines. Expertise and experience in building large data lakes and data warehouses such as Snowflake (preferred), Redshift, or Synapse. Expertise in SQL and NoSQL databases. Extensive data warehousing experience. At least 2 years of deploying and maintaining software using public clouds such as AWS or Azure. Working within an Agile framework (ideally Scrum). Strong analytical skills. Must demonstrate solid knowledge of computer science fundamentals like data structures, algorithms, and object-oriented design. Ability to work under pressure and within time constraints. Passion for technology and an eagerness to contribute to a team-oriented environment. Demonstrated leadership on medium- to large-scale projects impacting strategic priorities. A bachelor's degree in computer science, electrical engineering, or a related field is required. Caterpillar is an Equal Opportunity Employer (EEO). Posting Dates: June 17, 2025 - June 23, 2025. Not ready to apply? Join our Talent Community.
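On the concurrency point above: because source extracts are I/O-bound, a Python pipeline can overlap them with a thread pool. A minimal sketch under that assumption; the source names and URLs are invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests  # pip install requests

# Hypothetical source endpoints; a real pipeline would read these from config.
SOURCES = {
    "telemetry": "https://api.example.com/telemetry",
    "dealers": "https://api.example.com/dealers",
    "parts": "https://api.example.com/parts",
}

def extract(name: str, url: str) -> tuple[str, list]:
    """Fetch one source; I/O-bound, so threads overlap the waiting."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return name, resp.json()

def run_extracts() -> dict[str, list]:
    results = {}
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        futures = [pool.submit(extract, n, u) for n, u in SOURCES.items()]
        for fut in as_completed(futures):
            name, rows = fut.result()  # re-raises any extract failure
            results[name] = rows
    return results

if __name__ == "__main__":
    data = run_extracts()
    print({k: len(v) for k, v in data.items()})
```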
Posted 1 month ago
4.0 - 7.0 years
7 - 12 Lacs
Noida, Bengaluru
Work from Office
Paytm is India's leading mobile payments and financial services distribution company. A pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm’s mission is to serve half a billion Indians and bring them into the mainstream economy with the help of technology. About the role: As a MySQL database administrator (DBA), you will be responsible for the performance, integrity, and security of our databases. You'll be involved in the planning and development of the database, as well as in troubleshooting any issues on behalf of the users. Requirements: - 3 to 6 years of experience - Working knowledge of MySQL, AWS RDS, and AWS Aurora is a must - Replication - AWS administration - User management - Machine creation (manual or with Terraform) - AMI creation - Backup and restoration Why join us: A collaborative, output-driven program that brings cohesiveness across businesses through technology. Improve the average revenue per user by increasing cross-sell opportunities. A solid 360° feedback from your peer teams on your support of their goals. Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants – and we are committed to it. India’s largest digital lending story is brewing here. It’s your opportunity to be a part of the story! Location: Noida, Uttar Pradesh; Bangalore, Karnataka
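Several of the duties listed above (backup and restoration, AMI creation) are commonly scripted with boto3. A hedged sketch with placeholder identifiers, not Paytm's actual runbooks:

```python
from datetime import datetime, timezone

import boto3

rds = boto3.client("rds")
ec2 = boto3.client("ec2")

def snapshot_instance(db_instance_id: str) -> str:
    """Take a manual RDS snapshot, e.g. before a risky schema migration."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    snap_id = f"{db_instance_id}-manual-{stamp}"
    rds.create_db_snapshot(
        DBSnapshotIdentifier=snap_id,
        DBInstanceIdentifier=db_instance_id,
    )
    # Block until the snapshot is usable for a restore.
    rds.get_waiter("db_snapshot_available").wait(DBSnapshotIdentifier=snap_id)
    return snap_id

def create_ami(instance_id: str, name: str) -> str:
    """Image an EC2 host, e.g. a self-managed MySQL replica."""
    resp = ec2.create_image(InstanceId=instance_id, Name=name, NoReboot=True)
    return resp["ImageId"]

if __name__ == "__main__":
    # Identifiers below are placeholders.
    print(snapshot_instance("payments-mysql-prod"))
    print(create_ami("i-0abc123def456", "mysql-replica-golden"))
```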
Posted 1 month ago
4.0 - 9.0 years
11 - 14 Lacs
Hyderabad
Work from Office
Overview: iCIMS is seeking a Financial Analyst to join our FP&A team, based in India and reporting directly to the Vice President of FP&A. This highly visible role will focus on project-based financial analysis across revenue, expense, and operational/corporate reporting, serving as a force multiplier for the broader FP&A team by creating scalable solutions, driving automation and ad-hoc project work, and enhancing reporting efficiency. The ideal candidate brings a strong blend of financial acumen and technical expertise, thrives on solving complex problems, and is passionate about enabling others through better data and tools. This role works hours that partially overlap with U.S. business hours to collaborate effectively with global stakeholders. Responsibilities: Act as a technical enabler for the FP&A team by developing tools, templates, and scalable processes to enhance reporting, analytics, forecasting, and planning activities. Integrate data from multiple systems (Tableau, Salesforce, NetSuite, Adaptive Insights, Excel) into consolidated reports and dashboards. Build and maintain automated reporting solutions using Excel (including VBA/macros), Tableau, Adaptive, and other reporting tools to streamline workflows and improve data accessibility. Lead and deliver financial projects spanning revenue, expense, and operational/corporate reporting, ensuring solutions align with business priorities. Identify and implement process improvements to eliminate manual work, accelerate reporting timelines, and reduce errors. Collaborate closely with finance, accounting, and operational stakeholders to understand reporting needs and proactively develop solutions. Support monthly close, forecast, and long-range planning processes through development of reusable reporting models and automation. Translate complex financial and operational data into actionable insights for executive leadership. Maintain high standards of data accuracy, process consistency, and documentation across reporting deliverables. Qualifications: 4+ years of experience in FP&A, business analysis, financial systems, or a related analytical role. Expert-level Excel skills, including advanced formulas, data modeling, and VBA/macro development. Proficiency with data visualization tools (Tableau, Power BI, or similar). Strong analytical and problem-solving abilities with a continuous improvement mindset. Experience working with financial systems such as NetSuite, Adaptive Insights, Salesforce, or similar platforms. Excellent communication skills, with the ability to translate complex data into clear insights for diverse stakeholders. Proven ability to manage multiple complex deliverables simultaneously. Proven ability to work independently while collaborating across global teams. Ability and willingness to work overlapping U.S. business hours as required for collaboration. Education/Certifications/Licenses: Bachelor’s degree in Finance, Accounting, Business Analytics, Computer Science, Information Systems, or a related field.
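The posting centres on Excel/VBA and BI tools, but the consolidation step it describes (merging Salesforce and NetSuite extracts into one reporting layout) can be sketched in pandas. The file and column names below are assumptions, not the team's actual schemas:

```python
import pandas as pd  # pip install pandas openpyxl

# Hypothetical extracts from the systems named in the posting.
revenue = pd.read_csv("salesforce_bookings.csv")  # columns: month, region, bookings
expense = pd.read_csv("netsuite_expenses.csv")    # columns: month, region, spend

# Consolidate into one actuals table keyed on month + region.
actuals = revenue.merge(expense, on=["month", "region"], how="outer").fillna(0)
actuals["margin"] = actuals["bookings"] - actuals["spend"]

# Pivot into the region-by-month layout an FP&A review deck expects.
summary = actuals.pivot_table(
    index="region", columns="month", values="margin", aggfunc="sum"
)

# Ship a refreshed workbook for downstream Excel/Tableau consumers.
summary.to_excel("monthly_margin_summary.xlsx")
print(summary.round(1))
```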
Posted 1 month ago
3.0 - 6.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Join us as a Data Engineer II in Bengaluru! Build scalable data pipelines using Python, SQL, AWS, Airflow, and Kafka. Drive real-time and batch data systems across analytics, ML, and product teams. A hybrid work option is available. Required candidate profile: 3+ yrs in data engineering with strong Python, SQL, AWS, Airflow, Spark, Kafka, Debezium, Redshift, ETL, and CDC experience. Must know data lakes, warehousing, and orchestration tools.
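The Kafka/Debezium CDC experience asked for above usually means consuming change events. A minimal sketch using kafka-python, with an assumed Debezium topic name and the standard Debezium envelope (op/before/after):

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Debezium publishes one topic per captured table; the name is illustrative.
consumer = KafkaConsumer(
    "dbserver1.inventory.orders",
    bootstrap_servers=["localhost:9092"],
    group_id="warehouse-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    envelope = message.value or {}  # tombstone messages carry a null value
    payload = envelope.get("payload", envelope)
    op = payload.get("op")  # c=create, u=update, d=delete
    if op in ("c", "u"):
        row = payload["after"]
        print(f"upsert into staging: {row}")  # real code would COPY/MERGE
    elif op == "d":
        row = payload["before"]
        print(f"delete from staging: {row}")
```

Applying creates/updates as upserts keyed on the primary key keeps the warehouse copy idempotent under replays.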
Posted 1 month ago