2.0 - 6.0 years
0 Lacs
kochi, kerala
On-site
As an experienced ETL Developer (senior level) with 5 years of experience, you will be joining our team immediately. The role requires strong technical expertise, domain knowledge, and excellent stakeholder management skills.

Key Responsibilities:
- Develop and maintain ETL processes
- Design and implement data warehousing solutions
- Use advanced SQL and PL/SQL for data manipulation
- Work with the Snowflake platform for data storage and processing
- Apply Pharma/Healthcare domain expertise to optimize data workflows
- Manage stakeholders and communicate effectively
- Collaborate closely with leadership, technical, and business teams

Qualifications Required:
- Minimum of 5 years of experience in ETL
- At least 2 years of experience in the Healthcare domain

Good to Have:
- Knowledge of Veeva CRM data

The job is full-time and the work location is in person.
Posted 3 days ago
2.0 - 5.0 years
1 - 5 Lacs
thane, navi mumbai, mumbai (all areas)
Work from Office
Job Description: .NET Developer (3-5 Years of Experience)

Position Title: .NET Developer
Experience Required: 3-5 Years
Location: Thane, Maharashtra (On-Site)
Employment Type: Full-time

Role Overview
We are seeking a skilled .NET Developer with 3-5 years of experience to join our dynamic team. The ideal candidate should have a solid background in .NET development, a strong grasp of advanced SQL, and expertise in writing and optimizing stored procedures. This role involves designing, developing, and maintaining robust applications and contributing to the full software development lifecycle.

Key Responsibilities
1. Application Development: Design, develop, test, and deploy scalable and high-performing .NET applications. Collaborate with cross-functional teams to define, design, and ship new features.
2. Database Management: Develop and optimize complex SQL queries and stored procedures. Ensure database performance, integrity, and security.
3. System Maintenance: Troubleshoot and resolve technical issues, bugs, and performance bottlenecks. Update and maintain existing systems to improve functionality and efficiency.
4. Code Quality: Write clean, maintainable, and efficient code following best practices. Conduct code reviews and ensure adherence to coding standards.
5. Documentation and Reporting: Prepare technical documentation for applications, including design specifications and user guides. Provide regular progress reports and contribute to project planning.

Skills and Qualifications
Technical Skills:
- Programming Languages: Strong proficiency in C#, ASP.NET, .NET Core.
- Database: Advanced knowledge of SQL, stored procedures, and database optimization.
- Frameworks & Tools: Experience with Entity Framework, LINQ, and Web API development.
- Web Technologies: Familiarity with HTML5, CSS3, JavaScript, and frameworks like Angular or React is a plus.
- Version Control: Proficient in Git or other version control systems.
- Cloud Platforms: Experience with Azure or AWS (preferred).

Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills.
- Attention to detail and a proactive approach to challenges.

Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Preferred Qualifications
- Hands-on experience with database systems like SQL Server or PostgreSQL.
- Knowledge of CI/CD pipelines and DevOps practices.
- Familiarity with Agile methodologies and tools like JIRA.

Why Join Us?
- Work on cutting-edge technologies and challenging projects.
- Be part of a collaborative and growth-focused team.
- Opportunities for professional development and career progression.
Posted 4 days ago
4.0 - 8.0 years
6 - 16 Lacs
hyderabad, chennai, bengaluru
Work from Office
We are seeking a highly skilled Zuora Billing Specialist with strong expertise in subscription billing, invoicing, customer account management, and payment processing. The ideal candidate will have proven experience in Zuora Billing, Zuora APIs, Zuora Workflows, and Advanced SQL, along with solid integration knowledge across Salesforce, Avalara, and multiple payment gateways. This role will focus on optimizing billing processes, enhancing Zuora platform capabilities, automating workflows, and supporting end-to-end subscription lifecycle management in a fast-paced consulting environment.

Key Responsibilities:
- Manage and configure Zuora Billing, including customer accounts, subscriptions, invoicing, billing rules, rate plans, and payment processing.
- Design, develop, and optimize Zuora Workflows for automation and process efficiency.
- Configure and administer Zuora platform objects, product catalog, custom fields, and billing configurations.
- Support and enhance the Order Lifecycle Management process, including Order Harmonization, subscription changes, amendments, and renewals.
- Develop and maintain Zuora API integrations with Salesforce, Avalara, payment gateways, and internal systems.
- Leverage Advanced SQL for data analysis, reporting, reconciliation, and troubleshooting across Zuora and integrated platforms.
- Collaborate with cross-functional teams to support business requirements, system enhancements, and issue resolution.

Required Skills & Experience
- 4+ years of hands-on experience with Zuora Billing (subscription management, invoicing, customer accounts).
- Strong expertise in Zuora APIs and integrations with enterprise systems (Salesforce, Avalara, payment gateways).
- Proven experience in Order Management and Order Harmonization within Zuora.
- Demonstrated ability in creating and optimizing Zuora Workflows for process automation.
- Strong knowledge of the Zuora object model, platform configurations, billing rules, and product catalog management.
- Proficiency in Advanced SQL for querying, reporting, and troubleshooting (mandatory).
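For illustration, a minimal sketch of the reconciliation work the Advanced SQL requirement implies. SQLite is used only so the example is self-contained; the billing and ledger tables are hypothetical and are not Zuora's actual object model:

```python
import sqlite3

# Hypothetical reconciliation: compare invoice totals in a billing extract
# against amounts recorded in an integrated ledger.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE billing_invoice (invoice_id TEXT, account_id TEXT, amount REAL);
    CREATE TABLE ledger_entry   (invoice_id TEXT, posted_amount REAL);
    INSERT INTO billing_invoice VALUES ('INV-1','A1',100.0), ('INV-2','A1',250.0);
    INSERT INTO ledger_entry   VALUES ('INV-1',100.0), ('INV-2',240.0);
""")

# A LEFT JOIN surfaces invoices that are missing from, or mismatched in, the ledger.
mismatches = conn.execute("""
    SELECT b.invoice_id, b.amount, l.posted_amount
    FROM billing_invoice b
    LEFT JOIN ledger_entry l ON l.invoice_id = b.invoice_id
    WHERE l.posted_amount IS NULL OR l.posted_amount <> b.amount
""").fetchall()
print(mismatches)  # [('INV-2', 250.0, 240.0)]
```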
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
kochi, kerala
On-site
At EY, you will be part of a globally connected powerhouse of diverse teams that will shape your future with confidence and help you succeed. As a Data Modeller Developer with Cloud Exposure on the EY GDS Data and Analytics (D&A) team, you will play a crucial role in solving complex business challenges through data and technology. You will have the opportunity to work in sectors such as Banking, Insurance, Manufacturing, Healthcare, Retail, Supply Chain, and Finance.

Your responsibilities will include developing and maintaining complex data models, collaborating with stakeholders to understand data requirements, evaluating data modelling tools and technologies, creating data dictionaries and metadata repositories, optimizing database performance, and staying current with industry best practices in data modelling. You will also be expected to have hands-on experience with data modelling tools, proficiency in SQL, and familiarity with data warehousing and ETL processes.

To qualify for this role, you must have at least 8 years of experience in data modelling, a strong understanding of SQL, proficiency in tools such as Erwin, Visio, or PowerDesigner, and experience in designing and implementing database structures. Cloud knowledge and certification, particularly the Azure DP-203 certification, will be advantageous. Strong analytical and problem-solving skills and excellent communication and documentation skills are essential. The primary skills required for this position are Data Modelling and Advanced SQL; client management skills and a willingness to learn new things in a fast-paced environment are desirable.

Working at EY offers you inspiring projects, education, coaching, personal development opportunities, and the freedom to handle your role in a way that suits you best. EY is committed to building a better working world by creating new value for clients, people, society, and the planet. With a focus on data, AI, and advanced technology, EY teams help clients shape the future confidently and address the pressing issues of today and tomorrow. As part of a globally connected network, you will have the opportunity to work across services in more than 150 countries and territories.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
As a Cloud Data Platform Engineer, you will lead all aspects of a database platform: database design, security, DR strategy, development of standard processes, evaluation of new features, and workload analysis for optimization opportunities at both the system and application levels. You will drive automation efforts for effective database platform management and create self-service solutions for users. Collaborating with development teams, product managers, and business users is essential to review, optimize, and tune solution designs, and you will address platform-wide performance and stability issues. This role suits someone who embraces challenges, approaches problems with innovative solutions, excels in collaborative environments, and contributes to building and supporting a large Enterprise Data Warehouse.

Minimum Qualifications:
- 4+ years of database technology experience with Snowflake (preferred), Teradata, BigQuery, or Redshift.
- Proficiency in Advanced SQL.
- Experience in DBA functions, DR strategy, data security, governance, automation, and tooling for database platforms.

Key Qualifications:
- Proficiency in object-oriented programming with Python or Java.
- Ability to analyze production workloads and develop efficient strategies for running the Snowflake database at scale (a sketch follows this posting).
- Expertise in performance tuning, capacity planning, cloud spend management, and utilization.
- Experience with SaaS/PaaS enterprise services on GCP, AWS, or Azure (preferred).
- Familiarity with in-memory database platforms like SingleStore (a plus).
- Experience with Business Intelligence (BI) platforms such as Tableau, ThoughtSpot, and Business Objects (a plus).
- Strong communication and interpersonal skills: capable of effective collaboration with various functional groups in project teams, with a strong sense of project ownership.

Education & Experience:
- Bachelor's Degree in Computer Science Engineering or IT from a reputable school.

Please submit your CV for consideration.
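A hedged sketch of that kind of workload analysis, assuming the snowflake-connector-python client and placeholder credentials; SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY is Snowflake's built-in query log, and the 10-per-warehouse cutoff is arbitrary:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials; in practice these would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account", user="analyst", password="***", warehouse="ANALYTICS_WH",
)

# Rank the slowest queries of the past week per warehouse to find tuning candidates.
cur = conn.cursor()
cur.execute("""
    SELECT warehouse_name,
           query_id,
           total_elapsed_time / 1000 AS elapsed_s,
           bytes_scanned
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY warehouse_name ORDER BY total_elapsed_time DESC) <= 10
""")
for row in cur.fetchall():
    print(row)
```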
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
kochi, kerala
On-site
As a Data Engineer at Fingent, you will play a crucial role in designing, building, and optimizing data pipelines to ensure our data infrastructure is scalable, efficient, and meets business requirements. With your 4 years of proven experience in Advanced SQL and query optimization, you will develop and maintain complex SQL queries for data extraction, manipulation, and reporting. Your strong knowledge of database performance tuning techniques will be essential for the efficient processing of large data sets.

Collaborating closely with cross-functional teams, including data analysts, data scientists, and stakeholders, you will gather and understand business data requirements to deliver actionable insights. Your proficiency in ETL processes and visualization tools like Power BI or Tableau will be valuable in creating data visualizations and ensuring smooth data flow from various sources to data warehouses where required.

Your responsibilities will also include monitoring and troubleshooting data pipelines to uphold data accuracy, quality, and timely delivery. Strong problem-solving skills, attention to detail, and excellent communication and collaboration skills will be key to your success in this position, and your commitment to data security and compliance with company policies and industry regulations will help maintain a secure data environment.

Join Fingent and be part of a team that uses technology to create a better and smarter future. Your expertise in data analysis, data cleaning, and data validation will be instrumental in driving innovation and change through smart software development and consulting services. Apply now and be a thought leader in shaping a brighter tomorrow.
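To ground the complex-SQL expectation, a minimal, self-contained sketch of one common pattern: de-duplicating ingested rows with a window function. SQLite (3.25+) is used here only so the example runs anywhere; the schema is invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INT, customer_id INT, loaded_at TEXT);
    INSERT INTO orders VALUES
        (1, 10, '2024-01-01'), (1, 10, '2024-01-02'), (2, 11, '2024-01-01');
""")

# ROW_NUMBER() keeps only the most recently loaded row per order_id --
# a standard step before the data feeds reporting.
latest = conn.execute("""
    WITH ranked AS (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY order_id ORDER BY loaded_at DESC) AS rn
        FROM orders
    )
    SELECT order_id, customer_id, loaded_at
    FROM ranked WHERE rn = 1
    ORDER BY order_id
""").fetchall()
print(latest)  # [(1, 10, '2024-01-02'), (2, 11, '2024-01-01')]
```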
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Data Solutions Specialist at Quantiphi, you will be part of a global and diverse culture that values transparency, diversity, integrity, learning, and growth. We take pride in fostering an environment that encourages innovation and excellence, not only in your professional endeavors but also in your personal life.

Key Responsibilities:
- Utilize your 3+ years of hands-on experience to deliver data solutions that drive business outcomes.
- Develop data pipelines using PySpark within Databricks implementations (a sketch follows this posting).
- Work with Databricks Workspaces, Notebooks, Delta Lake, and APIs to streamline data processes.
- Utilize your expertise in Python, Scala, and advanced SQL for effective data manipulation and optimization.
- Implement data integration projects using ETL to ensure seamless data flow.
- Build and deploy cloud-based solutions at scale, ingesting data from sources like DB2.

Preferred Skills:
- Familiarity with AWS services such as S3, Redshift, and Secrets Manager.
- Experience implementing data integration projects using ETL, preferably with tools like Qlik Replicate and Qlik Compose.
- Proficiency with orchestration tools like Airflow or Step Functions for workflow management.
- Exposure to Infrastructure as Code (IaC) tools like Terraform and Continuous Integration/Continuous Deployment (CI/CD) tools.
- Previous involvement in migrating on-premises data to the cloud and processing large datasets efficiently.
- Knowledge of setting up data lakes and data warehouses on cloud platforms.
- Implementation of industry best practices to ensure high-quality data solutions.

If you are someone who thrives in an environment of wild growth and enjoys collaborating with happy, enthusiastic over-achievers, Quantiphi is the place to nurture your career and personal development. Join us in our journey of innovation and excellence!
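A minimal sketch of the PySpark-on-Databricks pipeline pattern this posting describes, assuming a Databricks-style runtime where Delta Lake is preconfigured; the paths and column names are illustrative:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

raw = spark.read.json("/mnt/landing/events/")           # ingest the source drop
clean = (raw
         .dropDuplicates(["event_id"])                  # keep re-runs idempotent
         .withColumn("event_date", F.to_date("event_ts"))
         .filter(F.col("event_type").isNotNull()))

# A partitioned Delta write keeps downstream queries pruned by date.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("event_date")
      .save("/mnt/curated/events/"))
```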
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
haryana
On-site
Job Description: As a Senior Tableau Developer with over 8 years of experience, you will design and develop dynamic Tableau dashboards and reports at our onsite location in Gurgaon, Sector 44. Your expertise in Tableau development, data visualization, and SQL will be crucial in translating business requirements into impactful dashboards.

Your key responsibilities will include connecting to various data sources, optimizing performance, collaborating with business teams to gather requirements, ensuring data accuracy, and enhancing dashboard usability. You will also play a vital role in mentoring junior team members and adhering to BI best practices.

To excel in this role, you must possess strong skills in Tableau Desktop and Server, advanced SQL (joins, subqueries, views), data modeling, and performance optimization. A solid understanding of BI and data visualization principles is essential for success.

While not mandatory, experience with Alteryx or Power BI, knowledge of cloud platforms such as AWS or Azure, and a Tableau Certification would be advantageous. Join us in leveraging your Tableau expertise to drive data-driven decision-making and deliver compelling visual insights to our organization.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a Database & Enterprise Warehouse Developer at ECS | Enterprise Change Specialists, you will play a crucial role in database development, administration, design, and Extract-Transform-Load (ETL) processes. This full-time on-site position, listed under Hyderabad, requires expertise in Oracle, Advanced SQL, PL/SQL, Redshift, data warehousing, ETL, integration, and Big Data to support enterprise data warehousing solutions. Your responsibilities will include analytical tasks and ensuring the smooth functioning of database systems.

You should have a minimum of 8-10 years of experience in database and enterprise warehouse development, with advanced SQL and PL/SQL knowledge. Experience in handling bulk data processing, particularly in Redshift, Oracle, and SQL Server databases, is essential. A strong understanding of warehouse data models, problem-solving ability, debugging skills, and performance tuning are key requirements for this role. The ideal candidate will have strong numerical and analytical skills and experience working in a Hadoop/Big Data environment.

The position is based in Dubai, UAE, and requires working from the office. Interested candidates who meet these qualifications can share their CVs with khaja.samiuddin@techecs.com.

Join ECS and be part of a global IT consulting company that specializes in technology solutions and services, offering opportunities in application development, artificial intelligence, and cloud migration. Apply now and contribute to bridging the gap between business and IT with expertise in Oracle, Salesforce, and Microsoft systems.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
As an Analyst, you will be responsible for analyzing business requirements and converting them into technical specifications. You will collaborate with stakeholders to elicit and document user requirements, design and implement system solutions using SQL Server and other technologies, and perform data analysis and reporting to support business decision-making. Your role will involve developing and maintaining technical documentation for system processes and functionalities, conducting system testing and troubleshooting, and working closely with IT development teams to facilitate system enhancements and integrations.

You will also assist in project management activities, ensuring timelines and budgets are met, conduct training sessions for end-users on system functionalities, and monitor and evaluate system performance for areas of improvement. You will engage with compliance and regulatory requirements specific to the insurance industry, support data modeling activities, and facilitate meetings between business units and technical teams to enhance collaboration. It is essential to keep abreast of industry trends and emerging technologies relevant to the insurance sector.

To qualify for this position, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a System Analyst, preferably in the insurance sector. Strong expertise in SQL Server and database management and a solid understanding of insurance processes and principles are required, as is proficiency in data analysis and reporting tools, technical documentation standards and practices, and problem-solving. Excellent verbal and written communication, the ability to work collaboratively in a team-oriented environment, and experience with project management methodologies are also necessary. Strong organizational skills with attention to detail, knowledge of regulatory and compliance issues in the insurance industry, familiarity with software development life cycle (SDLC) methodologies, and the ability to manage multiple priorities and meet deadlines are important for this role. A willingness to continuously learn and adapt to new technologies, plus relevant certifications in system analysis or project management, will be advantageous.

Your role will involve data analysis, user story creation, system analysis, project coordination, technical documentation, advanced SQL, problem-solving, and insurance domain knowledge.
Posted 2 weeks ago
8.0 - 10.0 years
25 - 30 Lacs
chennai
Work from Office
Responsibilities

Analyzing existing data sources:
- Expert understanding of data models and the various one-to-many, many-to-many, and other patterns, plus normalization and denormalization patterns and their purposes
- Profile data sources to reverse-engineer the data model and relationships between tables, identify key fields, and infer the meaning of attributes
- Meet with system owners to tie observations of data patterns to the business processes and use cases that lead to those patterns
- Conduct root cause analysis (RCA) to identify underlying issues and drive effective solutions
- Perform frequency distribution analysis to identify patterns and trends in the data (see the sketch after this posting)

Architecting the data strategy for the future state:
- Profile the data sources to identify anomalies, inconsistencies, and data quality issues
- Take ownership of the data, ensuring accuracy, completeness, and reliability
- Develop a target data model that provides optimal long-term functional opportunities
- Design the right change data capture and audit strategy
- Identify where reference tables are necessary
- Design mapping tables / helper tables to support configurable ETL

Requirements: General Attributes
- Advanced SQL skills and familiarity with a variety of DB technologies (Oracle, SQL Server, etc.)
- Comfortable with the AWS environment and Databricks
- Experience with ETL
- Demonstrate curiosity and a relentless pursuit of understanding complex datasets
- Engage with various stakeholders, including clinicians, researchers, data scientists, and IT professionals, to gather requirements and ensure alignment
- Communicate effectively with stakeholders at all levels, translating technical concepts into clear and actionable insights
- Lead deep data analysis initiatives to extract meaningful insights and drive data-driven decision-making
- Collaborate with cross-functional teams to develop and implement data models and solutions that meet business objectives
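Since this posting centers on profiling and frequency distribution analysis, here is a hedged PySpark sketch consistent with the Databricks/AWS stack it names; the source path and the status_code column are placeholders:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("profiling").getOrCreate()
df = spark.read.parquet("s3://bucket/source_table/")  # hypothetical source

# Frequency distribution for one column: how often each value occurs and what
# share of rows it represents. Skewed or high-cardinality columns stand out.
total = df.count()
profile = (df.groupBy("status_code")
             .count()
             .withColumn("pct", F.round(100 * F.col("count") / total, 2))
             .orderBy(F.desc("count")))
profile.show()

# Distinct counts per column round out a quick reverse-engineering pass.
df.select([F.countDistinct(c).alias(f"{c}_distinct") for c in df.columns]).show()
```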
Posted 2 weeks ago
8.0 - 10.0 years
25 - 30 Lacs
bengaluru
Work from Office
Responsibilities

Analyzing existing data sources:
- Expert understanding of data models and the various one-to-many, many-to-many, and other patterns, plus normalization and denormalization patterns and their purposes
- Profile data sources to reverse-engineer the data model and relationships between tables, identify key fields, and infer the meaning of attributes
- Meet with system owners to tie observations of data patterns to the business processes and use cases that lead to those patterns
- Conduct root cause analysis (RCA) to identify underlying issues and drive effective solutions
- Perform frequency distribution analysis to identify patterns and trends in the data

Architecting the data strategy for the future state:
- Profile the data sources to identify anomalies, inconsistencies, and data quality issues
- Take ownership of the data, ensuring accuracy, completeness, and reliability
- Develop a target data model that provides optimal long-term functional opportunities
- Design the right change data capture and audit strategy
- Identify where reference tables are necessary
- Design mapping tables / helper tables to support configurable ETL

Requirements: General Attributes
- Advanced SQL skills and familiarity with a variety of DB technologies (Oracle, SQL Server, etc.)
- Comfortable with the AWS environment and Databricks
- Experience with ETL
- Demonstrate curiosity and a relentless pursuit of understanding complex datasets
- Engage with various stakeholders, including clinicians, researchers, data scientists, and IT professionals, to gather requirements and ensure alignment
- Communicate effectively with stakeholders at all levels, translating technical concepts into clear and actionable insights
- Lead deep data analysis initiatives to extract meaningful insights and drive data-driven decision-making
- Collaborate with cross-functional teams to develop and implement data models and solutions that meet business objectives
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As an Azure Data Engineer at our company, you will design, develop, and deploy scalable data pipelines using Azure Data Factory, Databricks, and PySpark, working with large datasets (500 GB+) to optimize data processing workflows.

Key Responsibilities:
- Design, develop, and deploy scalable data pipelines using Azure Data Factory, Databricks, and PySpark
- Work with large datasets (500 GB+) to develop and optimize data processing workflows (a sketch follows this posting)
- Collaborate with cross-functional teams to identify and prioritize project requirements
- Develop and maintain data models, data integration, and workflow orchestration
- Ensure data quality, security, and compliance with organizational standards
- Troubleshoot data pipeline issues and optimize performance
- Communicate technical solutions to non-technical stakeholders

Requirements:
- 4+ years of experience in data engineering
- Azure certification is mandatory
- Strong proficiency in Azure, Databricks, PySpark, SQL, data modeling, data integration, and workflow orchestration
- Experience working with large datasets (500 GB+)
- Strong communication and interpersonal skills
- Ability to work a 2nd shift
- Experience in product engineering is a plus
- Minimum 2-3 projects with experience working on large datasets

Skills: Azure, SQL, integration, orchestration, communication skills, data security, Databricks, workflow, modeling, data quality, data integration, data modeling, advanced SQL, workflow orchestration, data compliance, PySpark
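A hedged sketch of the large-dataset discipline the posting implies: prune columns and partitions before any wide transformation, then right-size the output files. The ADLS Gen2 paths, container names, and schema are placeholders:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adf-databricks-job").getOrCreate()

# Reading 500 GB+ of source data: select only needed columns and filter early
# so Spark can prune partitions before the shuffle.
txns = (spark.read.parquet("abfss://raw@account.dfs.core.windows.net/txns/")
             .select("txn_id", "account_id", "amount", "txn_date")
             .filter(F.col("txn_date") >= "2024-01-01"))

daily = txns.groupBy("txn_date").agg(F.sum("amount").alias("total_amount"))

# Coalesce to a sane file count instead of emitting thousands of tiny files.
(daily.coalesce(32)
      .write.mode("overwrite")
      .parquet("abfss://curated@account.dfs.core.windows.net/daily_totals/"))
```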
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
delhi
On-site
You should have a minimum qualification of MCA/B.E./B.Tech in IT/CS/ECE with 4-7 years of relevant experience. Excellent communication skills are a must for this role.

Technical competencies: advanced SQL knowledge, proficiency in basic and advanced Excel, a strong understanding of SDLC methodologies such as the agile and waterfall models, familiarity with tracking tools like JIRA, and proficiency in HTML/CSS.

Personal attributes: strong verbal and written communication skills to collaborate effectively with cross-functional teams, plus strong interpersonal, analytical, and problem-solving skills.

Your responsibilities will include conducting impact analysis of client requirements, preparing analysis documents outlining system changes, configuring through SQL, conducting unit and integration testing, managing testing activities such as test planning and defect management, developing comprehensive test cases, adhering to SLAs and targets, accurately estimating work, mentoring team members and new hires, and driving process improvement initiatives.

This role requires someone who can handle medium- to high-complexity tasks effectively, work collaboratively with various teams, and contribute to the overall success of client implementations. Please note that this job description is a summary of the key requirements and responsibilities; for more details, please refer to the original job posting.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Senior Azure Data Engineer based in Chennai, India, you will play a crucial role in designing and implementing data pipelines that use Azure Synapse to integrate data from various sources and file formats into SQL Server databases. Your responsibilities will include developing batch and real-time data pipelines that move data into data warehouses and data lakes. Collaborating with the Data Architect, you will work on new data projects by constructing data pipelines and managing master data. Your expertise in data analysis, extraction, cleansing, column mapping, data transformations, and data modeling will be essential to meeting business requirements, and you will ensure data availability on the Azure SQL data warehouse by monitoring and troubleshooting data pipelines effectively.

To excel in this role, you must have a minimum of 3 years of experience designing and developing ETL pipelines using Azure Synapse or Azure Data Factory. Proficiency in Azure services such as ADLS Gen2, Databricks, Azure SQL, and Logic Apps is required, and strong implementation skills in PySpark and advanced SQL will be instrumental in achieving efficient data transformations. Experience handling structured, semi-structured, and unstructured data formats is a must, along with a clear understanding of data warehouse and data lake modeling and ETL performance optimization.

Additional skills that would be beneficial include working knowledge of consuming APIs in ETL pipelines, familiarity with Power BI, and experience in manufacturing data analytics and reporting. A degree in information technology, computer science, or a related discipline is preferred.

Join our global, inclusive, and diverse team dedicated to enhancing the quality of life through innovative motion systems. We value the diversity, knowledge, skills, creativity, and talents that each employee brings, and we are committed to fostering an inclusive, diverse, and equitable workplace where employees feel respected and valued, irrespective of their background. Our goal is to inspire our employees to grow, take ownership, and derive fulfillment and meaning from their work.
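For illustration, a minimal PySpark sketch of the cleansing and column-mapping step named above, under invented assumptions: a CSV landing file with mixed-case headers and stringly-typed values. Column names and formats are placeholders:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("synapse-style-etl").getOrCreate()

# Hypothetical semi-structured drop from an upstream system.
src = spark.read.option("header", True).csv("/landing/machines.csv")

mapped = (src
    .withColumnRenamed("Machine ID", "machine_id")        # column mapping
    .withColumnRenamed("Reading Val", "reading_value")
    .withColumn("reading_value", F.col("reading_value").cast("double"))
    .withColumn("read_ts", F.to_timestamp("Read Time", "yyyy-MM-dd HH:mm:ss"))
    .drop("Read Time")
    .na.fill({"reading_value": 0.0})                      # cleansing rule
    .dropDuplicates(["machine_id", "read_ts"]))

mapped.write.mode("overwrite").parquet("/curated/machine_readings/")
```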
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
haryana
On-site
You will be responsible for designing, building, and maintaining scalable and efficient data pipelines that move data between cloud-native databases (e.g., Snowflake) and SaaS providers using AWS Glue and Python. Your role will involve implementing and managing ETL/ELT processes to ensure seamless data integration and transformation while adhering to information security and data governance standards. You will also maintain and enhance data environments, including data lakes, warehouses, and distributed processing systems, and use version control systems (e.g., GitHub) effectively to manage code and collaborate with the team.

Primary skills: enhancements, new development, defect resolution, and production support of ETL development using AWS native services. Your responsibilities will include integrating data sets using AWS services such as Glue and Lambda functions, using AWS SNS to send emails and alerts, authoring ETL processes in Python and PySpark, monitoring ETL processes with CloudWatch events, connecting to data sources like S3, and validating data with Athena (a sketch of a Glue job in this style follows this posting). Experience with CI/CD using GitHub Actions, proficiency in Agile methodology, and extensive working experience with Advanced SQL are essential for this role.

Secondary skills: familiarity with Snowflake and an understanding of its architecture, including concepts like internal and external tables, stages, and masking policies.

Competencies and experience:
- Deep technical skills in AWS Glue (Crawler, Data Catalog): 10+ years
- Hands-on experience with Python and PySpark: 5+ years
- PL/SQL experience: 5+ years
- CloudFormation and Terraform: 5+ years
- CI/CD with GitHub Actions: 5+ years
- Experience with BI systems (Power BI, Tableau): 5+ years
- Good understanding of AWS services such as S3, SNS, Secrets Manager, Athena, and Lambda: 5+ years
- Familiarity with Jira and Git is highly desirable

This position requires a high level of technical expertise in AWS Glue, Python, PySpark, PL/SQL, CloudFormation, Terraform, GitHub Actions, BI systems, and AWS services, along with a solid understanding of data integration, transformation, and data governance standards. Your ability to collaborate effectively with the team, manage data environments efficiently, and ensure the security and compliance of data will be critical to success in this role.
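A minimal Glue job skeleton in the style the posting describes: read a Crawler-populated Data Catalog table, transform, write to S3, and notify via SNS. Job arguments, database, table, bucket, and topic names are all placeholders:

```python
import sys
import boto3
from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Job parameters passed in by the Glue job definition.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "SNS_TOPIC_ARN"])
glue = GlueContext(SparkContext.getOrCreate())

# Read from the Data Catalog (populated by a Crawler), then de-duplicate.
dyf = glue.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="customers")
df = dyf.toDF().dropDuplicates(["customer_id"])

df.write.mode("overwrite").parquet("s3://curated-bucket/customers/")

# SNS notification so operators hear about completion without polling logs.
boto3.client("sns").publish(
    TopicArn=args["SNS_TOPIC_ARN"],
    Subject="Glue job finished",
    Message=f"{args['JOB_NAME']}: wrote {df.count()} rows.")
```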
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
The Tech Lead Power BI position at ValueMomentum's Engineering Center in Hyderabad requires 7-10 years of experience and expertise in Advanced SQL, Power BI, Cognos, and SSRS. As a Tech Lead Power BI, you will design, develop, and implement business intelligence reports and dashboards using Power BI and related BI tools. Your role will involve converting business requirements into technical specifications, creating visual interactive reports, identifying KPIs, and analyzing data to support decision-making. You will also mentor other BI teammates and provide end-user support.

ValueMomentum's Engineering Center is a team of passionate engineers dedicated to addressing complex business challenges in the P&C insurance value chain. The team focuses on Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and domain expertise. The company invests in employee growth through its Infinity Program, offering role-specific skill development opportunities and impactful project contributions.

Key Responsibilities:
- Understand complex business requirements in the BI context and design data models
- Convert business requirements into technical specifications
- Create dashboards and visual interactive reports using Power BI
- Identify KPIs and monitor them consistently
- Analyze data to support decision-making
- Design, develop, and deploy Power BI scripts
- Mentor BI teammates and provide end-user support

Requirements:
- Proven experience in designing, developing, and maintaining reports in Power BI and Cognos
- Knowledge of data catalogs such as Purview and Cognos Data Manager
- Experience with the MS SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX
- Familiarity with cached architectures (Qlik, Tableau cached data) is a plus
- Experience with source code management tools such as Git
- Experience in the insurance or financial industry preferred
- Experience delivering BI solutions within Agile and DevOps methodologies
- Strong communication skills and attention to detail in high-pressure situations

ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, offering advisory, development, implementation, and maintenance services. The company is known for its deep domain and technology capabilities, empowering insurers to achieve sustained growth and enhanced stakeholder value. ValueMomentum's culture focuses on nurturing employees, celebrating wins, fostering collaboration, promoting diversity and inclusion, creating a fun work environment, and providing personalized onboarding experiences.

About the Company: ValueMomentum is headquartered in New Jersey, US, and is the largest standalone provider of IT services and solutions to insurers. The company's industry focus, technological expertise, and customer-first approach position it uniquely to drive momentum in customers' initiatives. ValueMomentum is a top-10 insurance-focused IT services firm in North America, trusted by leading insurance companies for digital, data, core, and IT transformation initiatives.

Benefits:
- Competitive compensation package
- Career development and advancement opportunities
- Comprehensive training and certification programs
- Comprehensive health benefits and life insurance
- Performance management and recognition for exceptional performers
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
haryana
On-site
As an ETL Developer on our team, you will be responsible for enhancements, new development, defect resolution, and production support of ETL development using AWS native services. Your expertise will be crucial in integrating data sets through AWS services such as Glue and Lambda functions. You will also use AWS SNS to send emails and alerts, author ETL processes in Python and PySpark, and monitor ETL processes using CloudWatch events. Your role will further involve connecting to various data sources like S3, validating data using Athena, and implementing CI/CD processes using GitHub Actions. Proficiency in Agile methodology is essential for effective collaboration within our dynamic team environment.

To excel in this position, you should possess deep technical skills in AWS Glue (Crawler, Data Catalog) with at least 5 years of experience, plus hands-on experience with Python, PySpark, and PL/SQL (a minimum of 3 years in each). Familiarity with CloudFormation, Terraform, and CI/CD via GitHub Actions is advantageous. Experience with BI systems such as Power BI and Tableau, along with a good understanding of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda, will also be beneficial in this role.

If you are a detail-oriented professional with a strong background in ETL development and a passion for leveraging AWS services to drive data integration, we encourage you to apply for this exciting opportunity on our team.
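A hedged sketch combining two of the duties above — validating loaded data with Athena and alerting through SNS. The database, table, bucket, topic ARN, and the row-count check itself are all hypothetical:

```python
import time
import boto3

athena = boto3.client("athena")
sns = boto3.client("sns")

# Run a post-load row-count check in Athena.
qid = athena.start_query_execution(
    QueryString="SELECT COUNT(*) FROM curated_db.orders WHERE order_date = current_date",
    QueryExecutionContext={"Database": "curated_db"},
    ResultConfiguration={"OutputLocation": "s3://athena-results-bucket/"},
)["QueryExecutionId"]

while True:  # poll until Athena finishes
    state = athena.get_query_execution(
        QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

count = -1
if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    count = int(rows[1]["Data"][0]["VarCharValue"])  # row 0 is the header

# Alert operators if the query failed or the load produced nothing.
if state != "SUCCEEDED" or count == 0:
    sns.publish(TopicArn="arn:aws:sns:us-east-1:123456789012:etl-alerts",
                Message=f"Orders load check failed: state={state}, rows={count}")
```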
Posted 3 weeks ago
5.0 - 10.0 years
4 - 7 Lacs
bengaluru
Work from Office
We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include:
- Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location, and transactional signals to power people-based marketing.
- Ingesting vast amounts of identity and event data from our customers and partners.
- Facilitating data transfers across systems.
- Ensuring the integrity and health of our datasets.
- And much more.

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities:
As a Senior Software Engineer or a Lead Software Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Engineers to optimize data models and workflows
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in the on-call rotation in your respective time zone (be available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 5-10 years of software engineering experience.
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things.
- Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments.
- Exposure to the whole software development lifecycle, from inception to production and monitoring.
- Fluency in Python, or solid experience in Scala or Java.
- Proficiency with relational databases and Advanced SQL.
- Expert in the use of services like Spark and Hive.
- Experience with web frameworks such as Flask or Django.
- Experience with a scheduler such as Apache Airflow, Apache Luigi, or Chronos (a DAG sketch follows this posting).
- Experience with Kafka or other stream message processing solutions.
- Experience using cloud services (AWS) at scale.
- Experience with agile software development processes.
- Excellent interpersonal and communication skills.

Nice to have:
- Experience with large-scale / multi-tenant distributed systems.
- Experience with columnar / NoSQL databases: Vertica, Snowflake, HBase, Scylla, Couchbase.
- Experience with real-time streaming frameworks: Flink, Storm.
- Experience with open table formats such as Iceberg, Hudi, or Delta Lake.
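A minimal sketch of a batch-ingestion DAG in the Airflow style this role names; the task bodies are stubs and the DAG id, schedule, and retry policy are illustrative:

```python
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_partner_feed(**context):
    ...  # pull identity/event files from a partner, land them in object storage

def validate_dataset(**context):
    ...  # row counts, schema checks, freshness assertions

with DAG(
    dag_id="identity_feed_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_partner_feed)
    validate = PythonOperator(task_id="validate", python_callable=validate_dataset)
    ingest >> validate  # validation runs only after ingestion succeeds
```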
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As a Senior Software Engineer in the Software department, you will be responsible for the following key tasks:
- Demonstrating proven work experience as a backend developer.
- Applying hands-on experience with Java; Java 8 is a requirement, and knowledge of Java 11/17 is an added advantage.
- Showing a strong understanding of the Spring Framework and Spring Boot, and proficient skills in RESTful API design.
- Leveraging AI-assisted development tools like GitHub Copilot and ChatGPT to improve code quality, speed up development, and automate repetitive tasks.
- Employing AI models and frameworks such as Llama for natural language understanding and generation tasks specific to product features.
- Implementing and enhancing AI inference using Groq hardware accelerators to optimize performance-critical workloads.
- Using Langsmith or similar AI workflow management tools to create, monitor, and enhance AI model pipelines and integrations.
- Working with Advanced SQL and PL/SQL, and with version control systems, especially Git.
- Ensuring compliance with coding conventions and industry best practices.
- Applying exceptional analytical and debugging skills.

The ideal candidate profile for this role includes:
- Previous experience working with AI-powered development environments and tools to increase productivity and foster innovation.
- A strong interest in keeping abreast of AI trends and integrating AI-driven solutions into software products.
- Proficiency in Java, Spring Boot, advanced SQL, and PL/SQL for intricate data querying and database programming.
- Familiarity with containerization and related platforms: Docker, Kubernetes (K8s), GCP.

If you are excited about the opportunity to work with cutting-edge technologies and contribute to the development of innovative software solutions, this role might be the perfect fit for you.
Posted 4 weeks ago
9.0 - 13.0 years
0 Lacs
punjab
On-site
As a Salesforce Marketing Cloud Consultant in Sydney with over 9 years of experience, you are expected to have strong domain knowledge of the Salesforce Marketing Cloud platform. Your responsibilities will include integrating Marketing Cloud with Sales/Service Cloud and other external data systems for data push/pull, including CRMs, ERPs, eCommerce platforms, Google Analytics, and SMS. You should have a solid understanding of relational data models, SOAP APIs, REST APIs, and integration techniques, along with advanced SQL skills.

Your role will involve designing Marketing Cloud journeys and campaigns based on data dependencies. Proficiency in Email Studio, Journey Builder, and campaign management is essential, including data configuration, audience creation, and use of SFMC platform capabilities. You should be adept at AMPscript for email personalization and at creating complex Cloud Pages with AMPscript. A technical background with a history of understanding complex systems is required, along with the ability to work both independently and collaboratively in a team environment. Strong communication skills and team-handling capabilities are crucial, and possession of the Salesforce Marketing Cloud Consultant certification is mandatory for this role.

Your experience in Email Studio, Journey Builder, Automation Studio, Web Studio/Cloud Pages, AMPscript, and the Marketing Cloud APIs (REST and SOAP) will be valuable. Key responsibilities include enabling and executing marketing automations, testing marketing automations with dynamic content, and designing and optimizing campaigns with strategies such as A/B testing and throttled sending. You will also personalize marketing campaigns, build and execute email campaigns using Content Builder, and leverage AMPscript for dynamic content. Subscriber and data management tasks, including working with Data Extensions, profile management, and relational data models, are also part of your responsibilities.

Additionally, you should be able to design mobile-responsive creatives, create microsites using ExactTarget, and configure ExactTarget Microsite Activity. Knowledge of UI and front-end web technologies such as SSJS, HTML, CSS, and JavaScript/jQuery will be considered a value-add in your role as a Salesforce Marketing Cloud Consultant.
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
pune, maharashtra
On-site
The ideal candidate should have 1 to 2 years of experience in providing technical training and mentoring. You must possess a strong understanding of data analytics and hands-on experience with Python, advanced Python, R programming, SAS, and machine learning. Proficiency in SQL and advanced SQL is essential, along with a basic understanding of statistics. Knowledge of operating systems such as GNU/Linux and of network fundamentals is required, as is familiarity with MS Office applications (Excel, Word, PowerPoint). The candidate should be self-motivated and technology-driven, possess excellent analytical and logical skills, and be a good team player. Exceptional communication and presentation skills are a must, and good aptitude skills are preferred.

Responsibilities include quickly grasping new technologies and effectively training other employees, resolving technical queries, conducting training sessions, and ensuring placement-driven quality in the training process. The candidate should be able to work independently without constant supervision and actively participate in reviews and meetings.

This is a full-time position with a day shift schedule. The work location is in person.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
You are a mid-level ETL Tester / SQL / Python Data Quality Engineer with 4-8 years of experience, specializing in data quality, functional testing, and advanced SQL. Your role at EIL Global, a prominent IT services provider based in Adelaide, Australia, involves ensuring data integrity and accuracy across systems in Chennai or Pune, working on-site 3 days a week.

You must have strong proficiency in Python or Java for automating testing processes and scripts. Proven experience in functional testing methodologies and advanced SQL skills are essential for efficiently extracting, manipulating, and validating large amounts of data. Experience with CI/CD pipelines, Selenium for automated web application testing, and Cucumber for behavior-driven development is crucial.

As a Data Quality Engineer, you will collaborate with cross-functional teams to understand data requirements, conduct thorough functional testing, develop robust test scripts using Python and SQL, and implement CI/CD practices. Your responsibilities include monitoring and maintaining data quality standards, documenting testing activities, and continuously enhancing testing strategies for optimal data assurance outcomes. Expertise in advanced SQL topics such as window functions, common table expressions, subqueries, analytical functions, full-text search, hierarchical queries, and optimization techniques is highly valued in this role.

The interview process comprises an L1 interview, a client interview, DIGI, and an HR interview, followed by an offer and onboarding. Join EIL Global to play a vital role in ensuring high standards of accuracy and consistency in data management.
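A self-contained sketch of the kind of automated data-quality check this role combines — pytest driving a SQL assertion that uses a CTE and a window function. The claims schema is invented, and SQLite stands in for the real database:

```python
import sqlite3
import pytest

@pytest.fixture
def db():
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE claims (claim_id INT, member_id INT, amount REAL);
        INSERT INTO claims VALUES (1, 100, 50.0), (2, 100, 75.0), (3, 101, 20.0);
    """)
    return conn

def test_no_duplicate_claim_ids(db):
    # COUNT(*) OVER a per-claim_id window flags any id that appears twice;
    # a duplicated row in the fixture data would make this test fail.
    dupes = db.execute("""
        WITH counted AS (
            SELECT claim_id,
                   COUNT(*) OVER (PARTITION BY claim_id) AS n
            FROM claims
        )
        SELECT DISTINCT claim_id FROM counted WHERE n > 1
    """).fetchall()
    assert dupes == [], f"duplicate claim_ids found: {dupes}"
```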
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a senior data consultant, you will serve as a trusted advisor and subject matter expert, using data to drive actionable business insights. Your role involves leading client engagements, delivering high-quality data solutions, and collaborating with sales teams to create accelerators, architectural artifacts, and pre-sales assets for RFPs. You will lead strategic data consulting projects, working directly with clients to design and implement data-driven solutions; develop and maintain a library of accelerators, frameworks, and artifacts to support sales proposals and RFP responses; and collaborate with cross-functional teams to ensure successful client delivery and seamless handoffs between sales and project implementation.

Key Skills / Technologies:
Must-Have:
- Data analytics and visualization (Tableau, Power BI, etc.)
- Advanced SQL and data querying skills
- Strong statistical and analytical expertise
- Experience in data integration, cleansing, and modeling
- Excellent communication and stakeholder management skills

Good-to-Have:
- Familiarity with programming languages (Python, R) for advanced analytics
- Knowledge of data warehousing and big data platforms
- Experience in a consulting or client-facing role
- Familiarity with data governance and business intelligence frameworks

Responsibilities:
Client Consulting & Delivery:
- Lead data analysis projects, define client requirements, and deliver actionable insights.
- Design data models and visualizations to support strategic decision-making across various business areas.
- Advise clients on data management and analytics best practices to optimize business processes.

Sales & Pre-Sales Support:
- Develop accelerators, consulting frameworks, and architectural artifacts for RFP responses and sales proposals.
- Support the sales team through client presentations, technical workshops, and pre-sales engagements.
- Provide expert insights and technical recommendations aligning client needs with technology solutions.

Collaboration & Mentoring:
- Work closely with technical, sales, and delivery teams to ensure cohesive strategies and smooth client handoffs.
- Mentor junior consultants and share best practices in data analysis and consulting methodologies.

Required Qualifications:
- Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, or a related field.
- 5+ years of experience in data consulting or analytics roles with client-facing responsibilities.
- Demonstrated ability to develop data-driven solutions that enhance business performance.
- Strong problem-solving and communication skills supporting both sales and client delivery.

Why Join Us:
- Contribute to transforming businesses through data on diverse, high-impact projects.
- Drive sales success and ensure high-quality client delivery simultaneously.
- Collaborate with a passionate team of experts, with continuous professional growth in a dynamic environment and competitive benefits.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Documentation Specialist, you will be responsible for creating world-class customer-facing documentation that delights and excites customers. Your role involves removing ambiguity by documenting information effectively, increasing team efficiency and effectiveness, and helping convert tacit knowledge into explicit knowledge.

You will manage a full region or multiple customers within a region, owning end-to-end communication and status reporting to both leadership and customers. Your responsibilities include managing your portfolio, estimates, asset projection, unit metrics, CARR (Contracted Annual Recurring Revenue) tracking, asset transfers, and cloud costs for fully owned projects. You will provide valuable data insights to customers, identify early warning signs of issues, and collaborate with Customer Success stakeholders. Collaborating effectively with stakeholders, managing escalations, planning transitions, and initiating hiring efforts are key aspects of the role, as is driving initiatives to achieve the target gross profit margin and CSAT score for your allocated portfolio while prioritizing work amid changing timeframes and incomplete information.

Your leadership skills will be crucial in mentoring, grooming, assessing, and providing balanced feedback to your team members; regular performance discussions and tracking of Individual Development Plans are essential. Additionally, you will act as a backup SEM for another region.

Required Skills:
- Advanced SQL and Unix experience
- Strong ETL and Python support skills
- Hands-on knowledge of analytics tools (Power BI or Tableau)
- Good healthcare knowledge
- Fundamental ITIL expertise
- Proficiency in support processes (SLAs, OLAs, product or application support)
- Project and program management abilities
- Escalation and team management skills
- A problem-solving mindset
- Excellent written and verbal communication skills
- Ambition and adaptability to a flexible startup environment with a focus on achieving goals
Posted 1 month ago