About our Team
LexisNexis Legal & Professional serves customers in more than 150 countries with 11,800 employees worldwide and is part of RELX, a global provider of information-based analytics and decision tools for professional and business customers. Our company has long been a leader in deploying AI and advanced technologies to the legal market to improve productivity and transform the business and practice of law. We deliver ethical, powerful generative AI solutions with a flexible, multi-model approach that uses the best model from today’s top model creators for each individual legal use case.
About the Role
Conditions of Employment:
You must be a U.S. citizen to apply for this position.
You must successfully pass a background investigation and achieve Public Trust security clearance.
You must be located near the Horsham, PA office to accommodate a hybrid onsite schedule.
Requirements
• Bachelor’s degree (Engineering or Computer Science preferred but not required) or equivalent experience.
• Hands-on experience in Azure Synapse Analytics, Azure Databricks, or Hadoop environments.
• Minimum 3 years of strong experience developing Python-based Spark code for large-scale data processing.
• Solid understanding of Big Data concepts, distributed computing, and data lake-based medallion architecture.
• Strong communication and collaboration skills to work with cross-functional teams.
• Strong proficiency in Python for data engineering, ETL, and automation tasks.
• Good working knowledge of Azure services, including Azure Data Factory (ADF) or Synapse Pipelines, Azure Data Lake Storage (ADLS), and Azure Key Vault/Secrets Management.
• Strong experience writing SQL.
• Experience working in both Linux and Windows environments for development and deployment.
• Solid working knowledge of GIT with Azure DevOps (Repos, Pipelines).
• Experience deploying code to higher environments using CI/CD pipelines & DevOps practices in Azure.
• Ability to troubleshoot, optimize, and monitor data pipelines in production.
• Exposure to Data Warehouses like AWS Redshift, Azure Dedicated SQL Pool, Netezza, etc.
• Familiarity with data security, governance, and compliance in cloud environments.
• Exposure to monitoring and logging frameworks for data workloads.
• Exposure to Microsoft Fabric.
• Knowledge of industry best practices (e.g., code coverage).
• Knowledge of software development methodologies (e.g., Agile).
• Good documentation skills.
• Attention to detail.
• Strong oral and written communication skills.
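For context on the medallion architecture and large-scale data processing mentioned above, the sketch below illustrates a typical bronze-to-silver cleaning step. This is an illustrative example only, not part of the role: plain Python stands in for PySpark, and all record fields are hypothetical.

```python
# Illustrative bronze-to-silver cleaning step in a medallion-style pipeline.
# In practice this logic would run as PySpark transformations over a data lake;
# plain Python is used here so the example is self-contained.

def bronze_to_silver(bronze_records):
    """Deduplicate raw records and normalize fields for the silver layer."""
    seen = set()
    silver = []
    for rec in bronze_records:
        key = rec.get("id")
        if key is None or key in seen:
            continue  # drop records without a key, and duplicate keys
        seen.add(key)
        silver.append({
            "id": key,
            "name": (rec.get("name") or "").strip().title(),   # normalize text
            "amount": float(rec.get("amount", 0) or 0),        # coerce to numeric
        })
    return silver

bronze = [
    {"id": 1, "name": " alice ", "amount": "10.5"},
    {"id": 1, "name": "ALICE", "amount": "10.5"},   # duplicate id, dropped
    {"id": 2, "name": "bob", "amount": None},       # null amount, coerced to 0.0
    {"name": "no id", "amount": "3"},               # missing key, dropped
]
print(bronze_to_silver(bronze))
```

The same pattern (filter, deduplicate, normalize types) maps directly onto Spark DataFrame operations such as `dropDuplicates` and `withColumn` when promoting data between medallion layers.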
Responsibilities:
• Interface with other technical personnel or team members to document, interpret, and finalize requirements.
• Produce code that is efficient, repeatable, without defects, and adherent to best practices such as naming conventions, encapsulation, etc.
• Write and review portions of detailed specifications for the development of data components.
• Complete data engineering bug fixes and issues, researching and identifying root causes as appropriate.
• Identify opportunities to apply automation or other tools to improve effectiveness or efficiency.
• Work closely with other development team members to understand product requirements and translate them into data engineering and/or data management designs.
• Innovate process improvements that enable efficient delivery and maintenance.
• Participate in the development processes, coding best practices, and code reviews.
• Oversee specific database management, ensuring that structure and data flow adhere to department standards.
• Utilize various data workflow management and analysis tools.
• Maintain deep familiarity with a specific data content area.
• Participate in process improvement and compliance to consistently deliver high-quality services on time and to specification, reacting quickly to changes in priorities or circumstances to meet business needs.
• Complete simple data engineering bug fixes and resolve technical issues as necessary.
• Work closely with other engineering team members to understand data and translate requirements.
• Operate in various development environments (Agile, etc.) while collaborating with key stakeholders.
• Keep abreast of new technology developments.
• All other duties as assigned.
Work in a way that works for you
We promote a healthy work/life balance across the organization and offer an appealing working environment for our people. With numerous wellbeing initiatives, shared parental leave, study assistance, and sabbaticals, we will help you meet both your immediate responsibilities and your long-term goals.
Working for you
We know that your wellbeing and happiness are key to a long and successful career. These are some of the benefits we are delighted to offer:
About the Business
LexisNexis Legal & Professional® provides legal, regulatory, and business information and analytics that help customers increase their productivity, improve decision-making, achieve better outcomes, and advance the rule of law around the world. As a digital pioneer, the company was the first to bring legal and business information online with its Lexis® and Nexis® services.


U.S. National Base Pay Range: $71,600 - $119,400. Geographic differentials may apply in some locations to better reflect local market rates.



This job is eligible for an annual incentive bonus.

We are delighted to offer country-specific benefits; please refer to the benefits information for your location.
We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form or please contact 1-855-833-5120.
Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants.
Please read our Candidate Privacy Policy.
We are an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law.