J.McLaughlin

Data Analytics Engineer

Brooklyn, New York, US · Full-time · Yesterday

Company

J.McLaughlin

Job type

Full-time

Location

Brooklyn, New York, US

Posted

Yesterday

Salary

Not specified

Job description

Why J.McLaughlin?

J.McLaughlin was founded in 1977 by brothers Kevin and Jay McLaughlin with a mission to create an American Sportswear brand offering two key components: classic clothing with current relevance and a retail environment with a neighborhood feel. The J.McLaughlin brand has always been more about style than fashion: straightforward, unpretentious, and devoid of the superfluous. Our clothing is rooted in the tradition of sport, work, and play. Across more than 180 retail locations, each store is entirely unique, attentively designed to reflect the town's color, character, and architecture. This attention to detail extends to exemplary customer service and local philanthropic engagement.

Our "Culture of Kindness" fosters an environment of respect, politeness, consideration, and empathy, creating a family-like atmosphere and a focus on giving back to the community. The company's entrepreneurial spirit opens up great experience and career opportunities, complemented by our great incentive and benefits programs.

Overview

J.McLaughlin is a specialty American Sportswear and Accessories brand headquartered in New York. J.McLaughlin has a reputation for being "local and loyal", building meaningful relationships within each community and providing customers with highly personalized service. We are a growing company with a focus on our culture of kindness, cultivating an exceptional atmosphere in which to work and shop.

The Data Analytics Engineer reports to the IT Application Manager. This role builds and maintains the data infrastructure, transforming raw data into clean, reliable datasets for analysts and business users. It does so by designing data models, creating ETL/ELT pipelines, implementing testing, and applying software engineering best practices, acting as a crucial bridge between data engineers and data analysts to enable data-driven decisions. Key responsibilities include data modeling, coding transformations (e.g., in SQL and dbt), ensuring data quality through testing, documenting processes, and collaborating with stakeholders to meet business needs.
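To make the transform-and-test workflow described above concrete, here is a minimal sketch using Python's built-in sqlite3 in place of a real warehouse and dbt. All table and column names are illustrative assumptions, not details from this posting.

```python
# Illustrative sketch only: sqlite3 stands in for a warehouse (Snowflake,
# BigQuery, etc.), and the hand-rolled checks mimic dbt-style data tests.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw, messy source data: a duplicate row and inconsistent casing.
cur.execute("CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "Alice", 120.0), (1, "alice", 120.0), (2, "Bob", 75.5)],
)

# Transformation step (what dbt would call a "model"): deduplicate and
# normalize casing into an analysis-ready table.
cur.execute("""
    CREATE TABLE clean_orders AS
    SELECT order_id,
           LOWER(MIN(customer)) AS customer,
           MIN(amount)          AS amount
    FROM raw_orders
    GROUP BY order_id
""")

# Data-quality tests (analogous to dbt's unique / not_null tests):
# order_id must be unique, and amount must never be null.
dupes = cur.execute(
    "SELECT order_id FROM clean_orders GROUP BY order_id HAVING COUNT(*) > 1"
).fetchall()
nulls = cur.execute(
    "SELECT COUNT(*) FROM clean_orders WHERE amount IS NULL"
).fetchone()[0]
assert dupes == [] and nulls == 0, "data-quality checks failed"

rows = cur.execute(
    "SELECT order_id, customer, amount FROM clean_orders ORDER BY order_id"
).fetchall()
print(rows)  # [(1, 'alice', 120.0), (2, 'bob', 75.5)]
```

In a production stack the transformation would live as a versioned dbt model and the assertions as declarative tests, scheduled by an orchestrator such as Airflow.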

This is a hybrid role primarily based in our Greenpoint, Brooklyn office.

About the role

Essential Functions

Data Modeling and Transformation

  • Designing and implementing structured data models (e.g. star/snowflake schemas) in data warehouses.
  • Writing code (often SQL) to clean, aggregate, transform, and enrich raw data into analysis-ready formats.

Data Quality & Testing

  • Developing automated tests and monitoring solutions for data accuracy and reliability.

Pipeline Development

  • Building and maintaining data pipelines (ETL/ELT) to move and process data.

Documentation, Collaboration, and Best Practices

  • Creating clear documentation for data models, transformations, and processes.
  • Working with data engineers to understand infrastructure and with analysts/stakeholders to define requirements.
  • Applying version control (Git) and CI/CD to analytics code.

What we are looking for

Skills & Requirements

  • Bachelor's degree in IT Applications or a related field, or equivalent experience, required.
  • 1-2 years of analytics or BI experience, preferably in the retail industry.
  • 2-3 years of technical experience with SQL, Python, Data Warehouse/Data Lake (Snowflake, BigQuery, Redshift, AWS), dbt, Airflow, Data Modeling.
  • Strong problem solving, critical thinking, and data interpretation skills.
  • Excellent written and oral communication skills with emphasis on teamwork and attention to detail.
  • Ability to operate with objectivity, integrity, professionalism, and confidentiality.

Physical Requirements

  • Must be able to access and navigate each department at the organization's facilities.
  • Prolonged periods of sitting at a desk and working on a computer.

Equal Opportunity

J.McLaughlin is an Equal Opportunity Employer and does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability or any other legally protected status.

