5,379 Hadoop Job Offers in Argentina
Data Processing Specialist
Posted 3 days ago
Job Description
Assurant Buenos Aires, Buenos Aires Province, Argentina
Want to join a growing team? At Assurant we are looking for Data Processing Specialists.
Responsibilities
- Understand complex insurance forms, identify the policy type, and update the database accurately.
- Make payments on time to ensure continuous coverage.
- Validate documents according to the procedures established by the client.
- Make phone calls to policyholders, banks, and well-known U.S. insurance companies to validate information.
What We Offer
Work in a multicultural environment, interacting with people from different countries.
Put your English into practice.
Work at a company recognized for more than 10 years as a Great Place to Work.
Requirements for This Position
Advanced level of English. (Required)
High school graduates and/or students in the early years of a university degree or technical program.
A minimum of 1 year of experience in data entry, back office, billing, administration, or similar roles is a plus.
If you think you meet the requirements for the position and enjoy a challenge, send us your CV!
Our Benefits
OSDE 210 (family plan)
SportClub discount (in case you like working out, or are looking for that missing bit of motivation to get started)
Hybrid work
Day off on your birthday (to celebrate, or to recover from the celebration)
2 free mornings/afternoons per year.
Free access to Udemy Learning (for those who never tire of learning)
Reimbursement for English classes
Partial reimbursement of university tuition for related degree programs (because your professional growth matters to us)
Home office expenses covered monthly
Discounts on everyday consumer goods.
And we don't want to give away spoilers, so we're saving a few things to tell you about later!
This job posting is for future opportunities. If you're interested in joining our talent network and being considered for upcoming long-term roles, we encourage you to apply and stay connected.
Apply today!
Data Processing Specialist
Posted 8 days ago
Job Description
Overview
Want to join a growing team? At Assurant we are looking for Data Processing Specialists.
Responsibilities
- Understand complex insurance forms, identify the policy type, and update the database accurately.
- Make payments on time to ensure continuous coverage.
- Validate documents according to the procedures established by the client.
- Make phone calls to policyholders, banks, and well-known U.S. insurance companies to validate information.
Workplace: Retiro, CABA. Hybrid.
Our Benefits and Environment
- Work in a multicultural environment, interacting with people from different countries.
- Put your English into practice.
- Work at a company recognized for more than 10 years as a Great Place to Work.
Requirements
- Advanced level of English. (Required)
- High school graduates and/or students in the early years of a university degree or technical program.
- A minimum of 1 year of experience in data entry, back office, billing, administration, or similar roles is a plus.
If you think you meet the requirements for the position and enjoy a challenge, send us your CV!
Benefits
- OSDE 210 (family plan)
- SportClub discount (in case you like working out, or are looking for that missing bit of motivation to get started)
- Hybrid work
- Day off on your birthday (to celebrate, or to recover from the celebration)
- 2 free mornings/afternoons per year.
- Free access to Udemy Learning (for those who never tire of learning)
- Reimbursement for English classes
- Partial reimbursement of university tuition for related degree programs (because your professional growth matters to us)
- Home office expenses covered monthly
- Discounts on everyday consumer goods.
And we don't want to give away spoilers, so we're saving a few things to tell you about later!
This job posting is for future opportunities. If you're interested in joining our talent network and being considered for upcoming long-term roles, we encourage you to apply and stay connected.
Apply today!
Business Analyst/Product Owner (with Data Analysis Focus) - EY Global Delivery Services
Posted 2 days ago
Job Description
Job Title: Business Analyst/Product Owner with a Data Analysis Focus
Role Overview:
A business-oriented professional with strong experience in data analysis and technology product management. This role bridges business needs and technical execution, requiring a deep understanding of how systems connect and the ability to coordinate cross-functional teams. While deep technical expertise is not required, a strong grasp of product lifecycle management, technical business requirements, and data analysis and reporting is essential.
Key Responsibilities:
- Translate business needs into clear, actionable functional requirements and user stories for a technical product.
- Build and maintain Business Requirements Documents (BRD), incorporating stakeholder input and addressing key elements such as rationale, workflows, visualizations, and potential edge cases.
- Collaborate closely with developers, technical product owners, and business stakeholders to ensure alignment and clarity.
- Lead and participate in weekly calls to discuss timelines, prioritization, blockers, and anticipated issues.
- Review and approve product functionalities and updates, ensuring they meet business and user expectations.
- Analyze and interpret data as well as effectively present data to leadership audiences to support strategic decisions.
- Troubleshoot issues and provide detailed responses to clarification questions from development teams (e.g., logic rules, UI behavior, data calculations).
- Coordinate with business owners and stakeholders to gather approvals and ensure alignment.
- Manage the formal product deployment process, including user story reviews, system approvals, and production pushes.
- Maintain access lists and user permissions for dashboards and tools.
- Monitor support channels (e.g., dashboard support email) and respond to user inquiries.
- Conduct content reviews from a business and assurance perspective to ensure outputs are client-ready.
- Collaborate with external teams (e.g., ASU RM) to identify and support additional technology needs for operationalizing tools like 360C.
- Oversee integrations with systems such as Resource Management System and SuccessFactors.
Requirements and Skills:
- Proficiency in Power BI, including dashboard creation, data interpretation, and report validation.
- Familiarity with Scrum and project management methodologies.
- Strong understanding of data analysis tools and business processes.
- Experience in business solutions; background in Talent/HR is a plus.
- Ability to manage multiple workstreams and stakeholder groups simultaneously.
Profile Characteristics:
- Analytical mindset with a strong business orientation.
- Excellent communication, coordination, and stakeholder management skills.
- Ability to work effectively with multidisciplinary teams and across time zones.
- Detail-oriented with a proactive approach to problem-solving and continuous improvement.
Job Requirements:
Education:
- Bachelor’s degree in Business Administration, Information Technology, Project Management, Data Analytics, or a related field; a master’s degree or relevant certifications (such as PMP, CAPM, or Six Sigma) are strongly preferred. Experience in technology management or product management may also be considered in lieu of advanced degrees.
Experience:
- Minimum of 7-8 years of experience in a medium-to-large environment, with a focus on technology management, business analysis, and data analysis.
- Experience in leading cross-functional teams and managing technology initiatives is preferred.
- Proven track record in data analytics, including data interpretation, reporting, and utilizing analytics tools to drive strategic decision-making.
- Experience in stakeholder management and collaboration with senior leadership is desirable.
Certification Requirements:
Preferred certifications for this role:
- Project Management Professional (PMP), Certified Associate in Project Management (CAPM), or Six Sigma certification is preferred.
- Data analytics certifications (e.g., Certified Analytics Professional or Google Data Analytics Certificate) are a plus.
- Familiarity with product management frameworks or certifications (e.g., Certified Product Manager) is beneficial.
Big Data Software Engineer
Today
Job Description
As a Software Engineer III at JPMorgan Chase within the Commercial & Investment Bank, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
The JPMorgan Chase Commercial & Investment Bank is undertaking a strategic initiative called Client 360, aimed at developing a big data platform and firmwide solution for Entity Resolution and Relationships. We are seeking a Big Data Software Engineer with skills and experience implementing a large-scale cloud platform that processes internal and third-party data. This individual will carry out groundbreaking work to implement new solutions for Client 360 - Entity Resolution and Relationships and to enhance the existing platform.
Job Responsibilities
- Acquire and manage data from primary and secondary data sources
- Identify, analyze, and interpret trends or patterns in complex data sets
- Transform existing ETL logic on AWS and Databricks (see the sketch after this list)
- Innovate new ways of managing, transforming and validating data
- Implement new or enhance services and scripts (in both object-oriented and functional programming)
- Establish and enforce guidelines to ensure consistency, quality and completeness of data assets
- Apply quality assurance best practices to all work products
- Analyze, design and implement business-related solutions and core architectural changes using Agile programming methodologies with a development team
- Become comfortable learning cutting-edge technology stacks and applying them to greenfield projects
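The posting contains no code, so purely as a hedged illustration of the kind of Spark/Databricks ETL transformation the responsibilities above describe, here is a minimal PySpark sketch. The S3 paths, column names, and deduplication rule are hypothetical, not taken from the posting.

```python
# Minimal PySpark ETL sketch -- illustrative only; all paths, column
# names, and business rules are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("client360-etl-sketch").getOrCreate()

# Extract: read raw third-party entity records (hypothetical S3 path).
raw = spark.read.parquet("s3://example-bucket/raw/entities/")

# Transform: normalize legal names, then keep only the most recently
# updated record per entity_id (simple deduplication).
latest = Window.partitionBy("entity_id").orderBy(F.col("updated_at").desc())
cleaned = (
    raw.withColumn("name_norm", F.upper(F.trim(F.col("legal_name"))))
       .withColumn("rn", F.row_number().over(latest))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

# Load: write the curated output partitioned by ingest date (hypothetical path).
cleaned.write.mode("overwrite").partitionBy("ingest_date") \
       .parquet("s3://example-bucket/curated/entities/")
```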
Qualifications
- Proficiency in advanced Python programming, with extensive experience in utilizing libraries such as Pandas and NumPy (a brief example follows this list).
- Experience in code and infrastructure for Big Data technologies (e.g. Spark, Kafka, Databricks etc.) and implementing complex ETL transformations
- Experience with AWS services including EC2, EMR, ASG, Lambda, EKS, RDS and others
- Experience developing APIs leveraging different back-end data stores (RDS, Graph, Dynamo, etc.)
- Experience in writing efficient SQL queries
- Strong understanding of linear algebra, statistics, and algorithms.
- Strong experience with UNIX shell scripting to automate file preparation and database loads
- Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues
- Familiarity with relational database environment (Oracle, SQL Server, etc.) leveraging databases, tables/views, stored procedures, agent jobs, etc.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Strong development discipline and adherence to best practices and standards.
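Likewise, as a minimal sketch of the Pandas/NumPy data-quality testing the qualifications above point to, the following is illustrative only; the frame, column names, and check rules are invented for the example.

```python
# Minimal Pandas/NumPy data-quality sketch -- column names and rules
# are hypothetical, for illustration only.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "entity_id": [1, 2, 2, 3],
    "balance":   [100.0, np.nan, 250.0, -5.0],
})

# Simple data-quality checks of the kind used in pipeline test cases.
checks = {
    "no_null_ids":     df["entity_id"].notna().all(),
    "no_dup_ids":      not df["entity_id"].duplicated().any(),
    "balance_non_neg": (df["balance"].dropna() >= 0).all(),
}

failed = [name for name, ok in checks.items() if not ok]
print("failed checks:", failed)  # -> ['no_dup_ids', 'balance_non_neg']
```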
Preferred Qualifications, Capabilities And Skills
- Experience in Data Science, Machine Learning and AI is a plus
- Financial Services and Commercial banking experience is a plus
- Familiarity with NoSQL platforms (MongoDB, AWS Open Search) is a plus
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.
We offer a competitive total rewards package including base salary determined based on the role, experience, skill set and location. Those in eligible roles may receive commission-based pay and/or discretionary incentive compensation, paid in the form of cash and/or forfeitable equity, awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility. These benefits include comprehensive health care coverage, on-site health and wellness centers, a retirement savings plan, backup childcare, tuition reimbursement, mental health support, financial coaching and more. Additional details about total compensation and benefits will be provided during the hiring process.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
JPMorgan Chase & Co. is an Equal Opportunity Employer, including Disability/Veterans
About The Team
Our Corporate Technology team relies on smart, driven people like you to develop applications and provide tech support for all our corporate functions across our network. Your efforts will touch lives all over the financial spectrum and across all our divisions: Global Finance, Corporate Treasury, Risk Management, Human Resources, Compliance, Legal, and within the Corporate Administrative Office. You'll be part of a team specifically built to meet and exceed our evolving technology needs, as well as our technology controls agenda.
Big Data Software Engineer - Python
Posted 3 days ago
Job Description
Overview
As a Software Engineer III at JPMorgan Chase within the Commercial & Investment Bank, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.
Responsibilities
- Acquire and manage data from primary and secondary data sources
- Identify, analyze, and interpret trends or patterns in complex data sets
- Transform existing ETL logic on AWS and Databricks
- Innovate new ways of managing, transforming and validating data
- Implement new or enhance services and scripts (in both object-oriented and functional programming)
- Establish and enforce guidelines to ensure consistency, quality and completeness of data assets
- Apply quality assurance best practices to all work products
- Analyze, design and implement business-related solutions and core architectural changes using Agile programming methodologies with a development team
- Become comfortable learning cutting-edge technology stacks and applying them to greenfield projects
Qualifications
- Proficiency in advanced Python programming, with extensive experience in utilizing libraries such as Pandas and NumPy
- Experience in code and infrastructure for Big Data technologies (e.g. Spark, Kafka, Databricks etc.) and implementing complex ETL transformations
- Experience with AWS services including EC2, EMR, ASG, Lambda, EKS, RDS and others
- Experience developing APIs leveraging different back-end data stores (RDS, Graph, Dynamo, etc.)
- Experience in writing efficient SQL queries
- Strong understanding of linear algebra, statistics, and algorithms
- Strong experience with UNIX shell scripting to automate file preparation and database loads
- Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues
- Familiarity with relational database environments (Oracle, SQL Server, etc.) leveraging databases, tables/views, stored procedures, agent jobs, etc.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Strong development discipline and adherence to best practices and standards
Preferred Qualifications, Capabilities and Skills
- Experience in Data Science, Machine Learning and AI is a plus
- Financial Services and Commercial banking experience is a plus
- Familiarity with NoSQL platforms (MongoDB, AWS Open Search) is a plus
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.
We offer a competitive total rewards package including base salary determined based on the role, experience, skill set and location. Those in eligible roles may receive commission-based pay and/or discretionary incentive compensation, paid in the form of cash and/or forfeitable equity, awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility. These benefits include comprehensive health care coverage, on-site health and wellness centers, a retirement savings plan, backup childcare, tuition reimbursement, mental health support, financial coaching and more. Additional details about total compensation and benefits will be provided during the hiring process.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
JPMorgan Chase & Co. is an Equal Opportunity Employer, including Disability/Veterans
About The Team
Our Corporate Technology team relies on smart, driven people like you to develop applications and provide tech support for all our corporate functions across our network. Your efforts will touch lives all over the financial spectrum and across all our divisions: Global Finance, Corporate Treasury, Risk Management, Human Resources, Compliance, Legal, and within the Corporate Administrative Office. You’ll be part of a team specifically built to meet and exceed our evolving technology needs, as well as our technology controls agenda.
Data Engineer
Today
Job Description
Background
Digital innovation is reshaping the insurance industry - We’re making it happen. Charles Taylor InsureTech was established to help insurance businesses drive change through the delivery of technology enabled solutions. We don’t have a one-size-fits-all approach or prescriptive methodology. We work consultatively with our clients to revitalise their operations, reinvent established processes and implement future-ready solutions that deliver measurable benefit and improve data-driven decision making.
Charles Taylor is a global provider of professional services and technology solutions dedicated to enabling the global insurance market to do its business fundamentally better. Dating back to 1884, Charles Taylor currently operates in more than 120 locations across 30 countries in Europe, the Americas, Asia Pacific, the Middle East and Africa.
Charles Taylor believes that it holds a distinctive position in its markets in that it is able to provide professional services and technology solutions in order to support every stage of the insurance lifecycle and every aspect of the insurance operating model. Charles Taylor serves a diversified blue-chip international customer base that includes national and international insurance companies, mutuals, captives, MGAs, Lloyd's syndicates and reinsurers, along with brokers, distributors and corporate insureds.
Charles Taylor has three distinct business areas – Claims Services, InsureTech and Insurance Management.
Charles Taylor was recently acquired by an investment company managed and controlled by Lovell Minnick Partners LLC. Lovell Minnick is a US Private Equity firm that invests in the global financial services industry, including related technology and business services companies, with a focus on helping to build long term value for clients, employees and shareholders. The acquisition will support the continuation of Charles Taylor's successful growth strategy, with a focus on expanding client relationships, broadening specialist capabilities and the range of services and technology solutions, deepening geographic coverage, and reinvesting in quality of service and technology.
For more information, please visit
The Role
We are looking for a Senior Data Engineer with proven experience in Big Data ecosystems to design, build, and optimize data pipelines and aggregated models that enable reliable and scalable reporting. This role requires strong hands-on expertise with Spark, Scala, and Impala, as well as a solid background in implementing ETL processes in complex environments.
Key Responsibilities
- Demonstrate and champion Charles Taylor Values by ensuring Agility, Integrity, Care, Accountability and Collaboration.
- Design, develop, and maintain ETL processes for large-scale data ingestion, transformation, and integration.
- Build aggregated data models to support business reporting and analytics (a brief sketch follows this list).
- Work with Big Data technologies (Spark, Scala, Impala) to ensure high-performance data pipelines.
- Optimize queries and processing workflows for efficiency and scalability.
- Partner with business stakeholders, data analysts, and BI teams to translate requirements into technical solutions.
- Ensure data quality, consistency, and governance across all layers of the architecture.
- Contribute to the continuous improvement of development practices and data engineering standards.
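The posting names Spark with Scala and Impala; purely as a hedged sketch (written in Python to keep this page's examples in one language), this shows what building an aggregated reporting model might look like. The table and column names are hypothetical. The design idea is simply to pre-aggregate to one row per policy and month so reporting queries stay cheap.

```python
# Minimal sketch of an aggregated reporting model -- the posting names
# Spark (Scala) and Impala; this Python version is illustrative only,
# and the table/column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-aggregates-sketch").getOrCreate()

claims = spark.table("raw.claims")  # hypothetical source table

# Aggregate claims into a reporting-friendly model: one row per
# policy and month, with counts and totals for dashboards.
monthly = (
    claims.withColumn("month", F.date_trunc("month", F.col("claim_date")))
          .groupBy("policy_id", "month")
          .agg(
              F.count("*").alias("claim_count"),
              F.sum("paid_amount").alias("total_paid"),
          )
)

# Persist the aggregate so BI tools (e.g. via Impala) can query it.
monthly.write.mode("overwrite").saveAsTable("reporting.claims_monthly")
```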
Requirements
- 3+ years of experience in Data Engineering or similar roles.
- Proven expertise in Spark (Scala) and Impala within Big Data environments.
- Strong experience with ETL design, implementation, and optimization.
- Advanced knowledge of distributed computing and parallel processing.
- Solid understanding of data modeling and performance tuning.
- Familiarity with data governance, data quality frameworks, and best practices.
- Strong problem-solving skills and ability to work in agile, fast-paced environments.
- Excellent communication skills in English (written and spoken).
The Charles Taylor InsureTech team blends hands-on insurance expertise with fresh thinking from the worlds of technology consulting, financial services, ecommerce and beyond.
As a newly established business which is part of Charles Taylor, we combine the agility of a start up with the security and scale of a corporate. The result is a pragmatic-yet-pioneering approach that we’ve used to help clients reimagine central market systems, launch self-service digital insurance products, automate regulatory reporting requirements and more.
We are very proud of the fact that nine out of ten of our people recommend Charles Taylor as a place to work. We pride ourselves on having a positive work environment where our people are empowered to make the best decisions and where learning is valued highly and shared across our business.
We are very committed to ensuring our people are given continuous learning and development. As well as structured induction programmes and job training, we provide study support for relevant professional qualifications and have a Core Learning & Development Curriculum.
Charles Taylor is a fun and inclusive place to work where people are truly valued and encouraged to enjoy a host of social and sporting activities available. Quiz nights, tennis tournaments, football matches and a range of other events take place throughout the year.
Equal Opportunity Employer
Here at Charles Taylor we are proud to be an Inclusive Employer. We provide an environment of mutual respect with zero tolerance to discrimination of any kind regardless of age, disability, gender identity, marital/ family status, race, religion, sex or sexual orientation.
Our external partnerships and the dedicated work we do in promoting a transparent and fair recruitment and selection process all contribute to the successful, inclusive and diverse culture and environment which we are proud to be a part of at Charles Taylor.
Data Engineer
Today
Job Description
Overview
This is a full-time work from home opportunity for a star Data Engineer from LATAM.
IDT is an American telecommunications company founded in 1990 and headquartered in New Jersey. Today it is an industry leader in prepaid communication and payment services and one of the world’s largest international voice carriers. We are listed on the NYSE, employ over 1,300 people across 20+ countries, and have revenues in excess of $1.5 billion.
IDT is looking for a skilled Data Engineer to join our BI team and take an active role in performing data analysis, ELT/ETL design and support functions to deliver on strategic initiatives to meet organizational goals.
Responsibilities
- Design, implement, and validate ETL/ELT data pipelines for batch processing, streaming integrations, and data warehousing, while maintaining comprehensive documentation and testing to ensure reliability and accuracy (a brief sketch follows this list).
- Maintain end-to-end Snowflake data warehouse deployments and develop Denodo data virtualization solutions.
- Recommend process improvements to increase efficiency and reliability in ELT/ETL development.
- Stay current on emerging data technologies and support pilot projects, ensuring the platform scales seamlessly with growing data volumes.
- Architect, implement and maintain scalable data pipelines that ingest, transform, and deliver data into real-time data warehouse platforms, ensuring data integrity and pipeline reliability.
- Partner with data stakeholders to gather requirements for language-model initiatives and translate into scalable solutions.
- Create and maintain comprehensive documentation for all data processes, workflows and model deployment routines.
- Should be willing to stay informed and learn emerging methodologies in data engineering, and open source technologies.
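As a hedged illustration of the Snowflake-centred ELT work described above, here is a minimal Python sketch using the snowflake-connector-python package; the account, stage, table, and column names are placeholders, not IDT's actual setup.

```python
# Minimal Snowflake ELT sketch -- illustrative only. Account, stage,
# table, and credential values are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # hypothetical
    user="etl_user",             # hypothetical
    password="...",              # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load: copy newly landed files from an external stage into a raw table.
    cur.execute("""
        COPY INTO staging.orders_raw
        FROM @landing_stage/orders/
        FILE_FORMAT = (TYPE = 'JSON')
    """)
    # Transform inside the warehouse (the "T" in ELT): upsert into the
    # curated table from the raw VARIANT column.
    cur.execute("""
        MERGE INTO analytics.orders AS t
        USING (SELECT raw:order_id::NUMBER AS order_id,
                      raw:amount::FLOAT    AS amount
               FROM staging.orders_raw) AS s
        ON t.order_id = s.order_id
        WHEN MATCHED THEN UPDATE SET t.amount = s.amount
        WHEN NOT MATCHED THEN INSERT (order_id, amount)
                              VALUES (s.order_id, s.amount)
    """)
finally:
    conn.close()
```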
Requirements
- 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics.
- Excellent English communication skills.
- Effective oral and written communication skills with BI team and user community.
- Demonstrated experience in utilizing python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
- Experience designing and implementing event-driven pipelines that leverage messaging and streaming events to trigger ETL workflows and enable scalable, decoupled data architectures.
- Experience in data analysis, root cause analysis and proven problem solving and analytical thinking capabilities.
- Experience designing complex data pipelines extracting data from RDBMS, JSON, API and Flat file sources.
- Demonstrated expertise in SQL and PLSQL programming, with advanced mastery in Business Intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc.
- Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning.
- Understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, and experience with Agile methodologies.
- Proficiency in version control systems, with experience in managing code repositories, branching, merging, and collaborating within a distributed development environment.
- Interest in business operations and comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights.
- Experience in developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities.
- Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools for designing, developing, and optimizing data integration workflows.
- Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, DBT, AWS Glue, Lambda and open-source tools.
- Experience with reporting/visualization tools (e.g., Looker) and job scheduler software.
- Experience in Telecom, eCommerce, International Mobile Top-up.
- Preferred Certification: AWS Solution Architect, AWS Cloud Data Engineer, Snowflake SnowPro Core.
Please attach CV in English.
The interview process will be conducted in English.
Only accepting applicants from LATAM.
Data Engineer
Today
Job Description
Join Our Data Products and Machine Learning Development Remote Startup!
Mutt Data is a dynamic startup committed to crafting innovative systems using cutting-edge Big Data and Machine Learning technologies.
We’re looking for a Data Engineer to help take our expertise to the next level. If you consider yourself a data nerd like us, we’d love to connect!
What We Do
- Leveraging our expertise, we build modern Machine Learning systems for demand planning and budget forecasting.
- Developing scalable data infrastructures, we enhance high-level decision-making, tailored to each client.
- Offering comprehensive Data Engineering and custom AI solutions, we optimize cloud-based systems.
- Using Generative AI, we help e-commerce platforms and retailers create higher-quality ads, faster.
- Building deep learning models, we enhance visual recognition and automation for various industries, improving product categorization, quality control, and information retrieval.
- Developing recommendation models, we personalize user experiences in e-commerce, streaming, and digital platforms, driving engagement and conversions.
Who We Work With
- Amazon Web Services
- Google Cloud
- Astronomer
- Databricks
- Kaszek
- Product Minds
- H2O.ai
- Soda
Our Values
- We are Data Nerds
- We are Open Team Players
- We Take Ownership
- We Have a Positive Mindset
Curious about what we’re up to? Check out our case studies and dive into our blog post to learn more about our culture and the exciting projects we’re working on!
Responsibilities
- Collaborate with the team to define goals and deliver custom data solutions.
- Innovate with new tools to improve Mutt Data's infrastructure and processes.
- Design and implement ETL processes, optimize queries, and automate pipelines.
- Own projects end-to-end—build, maintain, and improve data systems while working with clients.
- Develop tools for the team and assist in tech migrations and model design.
- Build scalable, high-performance data architectures.
- Focus on code quality—review, document, test, and integrate CI/CD.
Requirements
- Experience in Data Engineering, including building and optimizing data pipelines.
- Strong knowledge of SQL and Python (Pandas, Numpy, Jupyter).
- Experience working with any cloud (AWS, GCP, Azure).
- Basic knowledge of Docker.
- Experience with orchestration tools like Airflow or Prefect (a brief sketch follows this list).
- Familiarity with ETL processes and automation.
- Experience with stream processing tools like Kafka Streams, Kinesis, or Spark.
- Solid command of English for understanding and communicating technical concepts (Design Documents, etc.).
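As a hedged sketch of the Airflow-style orchestration mentioned in the requirements (assuming Airflow 2.x), the following minimal DAG wires an extract, transform, and load step together; the DAG id, schedule, and task bodies are invented placeholders.

```python
# Minimal Airflow DAG sketch -- illustrative only; DAG id, schedule,
# and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source (placeholder)")


def transform():
    print("clean and aggregate the data (placeholder)")


def load():
    print("write results to the warehouse (placeholder)")


with DAG(
    dag_id="example_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # run extract, then transform, then load
```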
Benefits
- 20% of your salary in USD
- Remote-first culture – work from anywhere!
- Gympass or sports club stipend to stay active.
- AWS & Databricks certifications fully covered (reward salary increase for AWS certifications!).
- Food credits via Pedidos Ya – because great work deserves great food.
- Birthday off + an extra vacation week (Mutt Week!)
- Referral bonuses – help us grow the team & get rewarded!
- An unforgettable getaway with the team!
Data Engineer
Today
Job Description
We're hiring a highly motivated Data Engineer with expertise in Python, AWS Glue/pyspark, AWS Lambda, Kafka, and relational databases (specifically Postgres). You'll be responsible for designing, developing, and maintaining data processing and management solutions, ensuring data integrity and availability through collaboration with multidisciplinary teams.
Responsibilities:
- Design and develop efficient data pipelines using Python, AWS Glue/pyspark, AWS Lambda, Kafka, and related technologies (see the sketch after this list).
- Implement and optimize ETL processes for data extraction, transformation, and loading into relational databases, especially Postgres.
- Collaborate on data warehouse architectures, ensuring proper data modeling, storage, and access.
- Utilize tools like StitchData and Apache Hudi for data integration and incremental management, improving efficiency and enabling complex operations.
- Identify and resolve data quality and consistency issues, implementing monitoring processes across pipelines and storage systems.
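As a hedged sketch of the AWS Glue (PySpark) to Postgres pipeline work this posting describes, here is a minimal Glue job skeleton; the catalog database, table names, JDBC URL, and credentials are hypothetical placeholders.

```python
# Minimal AWS Glue (PySpark) job sketch -- illustrative only; catalog
# database, table, and connection values are hypothetical placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a source table registered in the Glue Data Catalog.
events = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events"
)

# Transform: drop obviously bad rows with plain PySpark.
df = events.toDF().filter("event_id IS NOT NULL")

# Load: write the cleaned rows to Postgres over JDBC.
(df.write.format("jdbc")
   .option("url", "jdbc:postgresql://example-host:5432/analytics")
   .option("dbtable", "public.events_clean")
   .option("user", "etl_user")     # use Secrets Manager in practice
   .option("password", "...")
   .mode("append")
   .save())

job.commit()
```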
Requirements:
- Strong Python skills and proficiency in associated libraries for data processing and manipulation.
- Expertise in AWS Glue/pyspark, AWS Lambda, and Kafka for ETL workflow development and streaming architecture.
- Experience in designing and implementing relational databases, specifically Postgres.
- Practical knowledge of data pipeline development, ETL processes, and data warehouses.
- Familiarity with data integration tools like StitchData and Apache Hudi for efficient incremental data management.
- Advanced level of English for effective communication and collaboration.
Benefits:
- Competitive salary.
- Annual offsite team trip.
- Learn from a very high performing team.
- Training from US mentors.
If you're a passionate Data Engineer with experience in these technologies and seek a stimulating and challenging environment, join our team and contribute to the success of our advanced data processing and management solutions!