
Subject Matter Expert (SME) – Big Data Analytics

Background:

We are seeking a Subject Matter Expert (SME) in Big Data & Analytics to lead the design, development, and optimization of data-driven solutions. The ideal candidate will have deep experience in big data technologies, data pipelines, and advanced analytics to drive business intelligence, predictive modeling, and strategic decision-making.

Scope of Work:

  • Create a course structure for a certificate program of 4-5 courses (final count to be determined during scoping). Each course is likely to have 4-5 modules and about 25 lessons in total, so a 4-course program could have up to 100 lessons.
  • Work closely with the client in a rigorous scoping process to create a Job Task Analysis document and content structure for each program and course.
  • Create program-level learning objectives for professional certificate courses. The number of objectives will depend on the level – beginner, intermediate, or advanced – and the type of certification course.
  • Create course-level learning objectives aligned with the overall certification goal. 
  • Create module-level learning objectives based on skill development relevant to the target group’s (TG) career track.
  • Review/create Course Outlines for each of the courses.
  • Review video scripts and confirm the technical accuracy of the content; suggest edits and updates as required. Rewrite content and code as needed. Incorporate one round of internal and client feedback.
  • Record talking-head videos (onsite or virtually via Zoom) for each course. Incorporate one round of internal and client feedback.
  • Provide relevant recorded demos/screencasts to be integrated in the videos. Check the code and technical accuracy before providing the demos for integration. Incorporate one round of internal and client feedback.
  • For AI/software/tool-based courses, suggest relevant freeware. Write/review and test the code to confirm it works.
  • Create/review 2-3 readings per lesson ("Why" and "What", 1,500 words maximum per reading). "How" readings should include detailed instructions and screenshots with short code-block practice that learners can run in their local environment (see the sample snippet after this list).
  • Create one Coach item per lesson – review/reflect on key ideas.
  • Create/review an ungraded lab per lesson – an in-depth activity to apply skills in the learner’s local environment.
  • Create/review practice quizzes for each lesson, suggest suitable edits, and confirm technical accuracy. Incorporate one round of internal and client feedback.
  • Create module-level and course-level graded assignments that meet ACE recommendation requirements, with two additional variations of each item in an assessment bank for each course.
  • Create 3-4 hands-on activities per course (labs or any other client-preferred format). Incorporate one round of internal and client feedback.
  • Create at least one 3-5 minute career-resources video per course that showcases career-path planning.
  • For all reviews: validate content accuracy and provide recommendations/suggestions, write/rewrite to fill content gaps as necessary, write/test code and labs, and incorporate one round of internal feedback and two rounds of client feedback.
  • Be available for client and content discussions as and when required.
  • There will be an initial Discovery Phase of 3-4 weeks with the client LDC, requiring 20-25 hours per week from the SME.
  • During this phase, the SME will work in tandem with the client stakeholders to:
    1. Scope and define the skill-based content to be covered in each course.
    2. Co-create and validate a Job Task Analysis document and Content Development Format (CDF).
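To illustrate the kind of short code-block practice a "How" reading might include (referenced in the readings item above), here is a minimal sketch in PySpark. It is a hypothetical example rather than client content: it assumes a local pyspark installation and uses a tiny in-memory dataset so learners need no external files.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical "How" reading practice: count words with Spark, locally.
    spark = (
        SparkSession.builder
        .master("local[*]")              # run on the learner's machine, no cluster
        .appName("how-reading-practice")
        .getOrCreate()
    )

    # Tiny in-memory dataset so the exercise needs no external files.
    df = spark.createDataFrame(
        [("big data analytics",), ("big data pipelines",), ("data modeling",)],
        ["text"],
    )

    word_counts = (
        df.select(F.explode(F.split(F.col("text"), " ")).alias("word"))
        .groupBy("word")
        .count()
        .orderBy(F.desc("count"))
    )
    word_counts.show()
    spark.stop()

Learners run the snippet locally and compare the output against a screenshot in the reading – the same hands-on loop the ungraded labs then extend.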

Requirements:

  • 8+ years of experience in data engineering, big data architecture, or analytics roles.
  • Strong expertise in the Hadoop ecosystem (HDFS, Hive, Pig, HBase) and Apache Spark.
  • Proficiency in data integration tools and frameworks like Apache NiFi, Airflow, or Talend.
  • Experience with cloud platforms (AWS Redshift, Azure Synapse, Google BigQuery) and data lake/storage solutions.
  • Hands-on experience with SQL, Python, Scala, or Java.
  • Solid understanding of data warehousing, data modeling, and real-time data streaming (e.g., Kafka, Flink); a minimal consumer sketch follows this list.
  • Familiarity with BI tools like Power BI, Tableau, or Looker.
  • Strong problem-solving and communication skills with the ability to explain technical concepts to non-technical stakeholders.
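As a hedged illustration of the streaming skill named above (not an additional requirement), a minimal Kafka consumer in Python might look like the sketch below. The broker address and topic name are placeholders, and the kafka-python package is one of several client libraries a candidate might use.

    import json

    from kafka import KafkaConsumer  # kafka-python client library

    # Hypothetical sketch: "events" and localhost:9092 are placeholders.
    consumer = KafkaConsumer(
        "events",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="earliest",  # start from the beginning of the topic
    )

    # Each record arrives with its partition, offset, and deserialized JSON value.
    for message in consumer:
        print(f"partition={message.partition} offset={message.offset} value={message.value}")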

Preferred Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field.
  • Experience working in regulated industries (e.g., finance, healthcare) with a focus on data compliance and privacy.
  • Familiarity with AI/ML frameworks like TensorFlow, PyTorch, or MLlib.
  • Certifications in cloud platforms or big data technologies (e.g., AWS Big Data Specialty, GCP Data Engineer).

 

Timelines and Payout: 

Project Start Date: Immediate

Project Duration: 6 Months

Time Availability: 25 hours per course

Job Type: Contract

Work Location: Remote

 

You must take the necessary steps to safeguard the integrity, security, and confidentiality of shared confidential information.

For additional information on Hurix, please visit: https://www.hurix.com/life-at-hurix/


Apply for this position (accepted file types: .pdf, .doc, .docx).