Let’s assume we’d like to develop a video streaming platform. Here are the basic challenges and some potential solutions.
Solution: Define goals → develop criteria to match goals → map measures and methods to criteria
Define Goals
The first challenge of the project is to determine what it means to be “intentional.” To do that, start with the core people problems identified in last half’s research:
- People have unclear mental models around the product.
- People lack a “sense of control” over their user experiences on a platform — this lack of agency contributes to people feeling worse about themselves.
Generate criteria
Next, identify the criteria that determine whether the effort addresses the two people problems.
Develop a clear mental model on:
- Are people able to differentiate the purposes between feed and player?
- To what extent does the player provide an immersive consumption experience?
Give people more control over their video experience:
- How easy is it for people to choose videos that are related to their interests?
- How easy is it to choose a chained video to watch next?
Attach methods to the criteria
You then map the criteria to methods and a timeline based on product development progress. There are two main phases — before and after public testing.
- Explore & define what to build: The focus of this phase is for design and research to determine what to build.
- Validate & refine what you build: This phase begins once the Eng team starts to publicly test the new features.
- On-platform, quantitative validation: Surveys to evaluate overall experience, rapid feedback surveys to test sentiment towards specific features, on-platform usability to ensure clarity and ease of use
- Off-platform, qualitative refinement: Qualitative sessions with people in the experiment groups to gain deep understanding of their experience using the new features.
Solution: An iterative, phased approach.
Here are some steps to make the evaluation process actionable:
- Can anything be tested live? First, decide which concepts don’t require Engineering to build new elements (e.g. a click-to-play channel player); those can be evaluated easily with public testing, leaving fewer concepts for qualitative research.
- Dive deep & iterate. Take an incremental, iterative approach to evaluate the concepts deemed to require qualitative feedback. In every prototype round, incorporate concepts that have never been tested alongside refined versions of previously tested concepts. This lets you determine the best designs, refine them gradually, and form opinions early about the final end-to-end flow.
- Diversify testing flows. A helpful and easy way to test more prototypes is to create different groups and flows, and rotate them across participants. Rotation also helps eliminate the biases from effects such as ordering, where people tend to prefer the first variant presented to them.
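The rotation idea above can be sketched in code. This is a minimal illustration of cycling presentation orders across participants so each prototype appears first equally often; the prototype and participant names are hypothetical, not the team’s actual tooling:

```python
def rotation_orders(prototypes):
    """One presentation order per rotation: shift the start position
    so each prototype leads equally often across the set of orders."""
    n = len(prototypes)
    return [[prototypes[(start + i) % n] for i in range(n)]
            for start in range(n)]

def assign_orders(participants, prototypes):
    """Rotate the orders across participants to counterbalance
    ordering effects (e.g. preference for the first variant shown)."""
    orders = rotation_orders(prototypes)
    return {p: orders[i % len(orders)]
            for i, p in enumerate(participants)}

# Hypothetical prototypes and participants, for illustration only.
plan = assign_orders(["P1", "P2", "P3", "P4", "P5"],
                     ["feed-redesign", "channel-player", "chained-next"])
# P1 sees feed-redesign first, P2 sees channel-player first, etc.
```

For stronger counterbalancing (controlling which prototype follows which), a balanced Latin square is the usual next step; simple rotation already equalizes first positions.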
Set limits and prioritize
Throughout the process, be clear about the maximum number of prototypes (roughly 4*) you can present during a 60-minute session while still getting rich insights. You can also prioritize which prototypes and questions to present.
*Number could vary based on prototype complexity.
Solution: Create small deadlines and think ahead.
Identify small deadlines
With the deadline in mind, map the session dates for all phases on the calendar; these session dates become small deadlines to plan toward early.
Plan & act super early
Sending requests and planning logistics early allows you to focus on conducting research and generating insights once the sessions start.
Leave room & time buffer to ensure quality
Buffers are crucial to ensuring quality. Examples include time for leads’ review and feedback, prototype pre-tests and bug fixes, etc.
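One way to make the small deadlines and buffers concrete is to schedule backward from the final deadline. The sketch below is a simplified illustration with hypothetical milestone names and durations, not a prescribed plan:

```python
from datetime import date, timedelta

def backward_schedule(deadline, steps):
    """Work backward from the final deadline.
    Each step is (name, days needed); the last step ends at the
    deadline, and every earlier step must start correspondingly sooner.
    Returns (name, start_date) pairs in chronological order."""
    schedule = []
    current = deadline
    for name, days in reversed(steps):
        current = current - timedelta(days=days)
        schedule.append((name, current))
    return list(reversed(schedule))

# Hypothetical milestones, with buffers included as explicit steps.
plan = backward_schedule(date(2021, 6, 30), [
    ("recruit participants", 5),
    ("prototype pre-test + bug-fix buffer", 3),
    ("run sessions", 5),
    ("synthesize insights", 4),
    ("leads review + feedback buffer", 3),
])
for name, start in plan:
    print(start, name)
```

Treating buffers as first-class steps (rather than hoping slack appears) is what keeps the review and pre-test time from being squeezed out.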
Solution: Set up a standard process with template. Level up your vendors.
Use templates
Using templates (screener, SQL code to pull participant recruitment lists, research plan, etc.) saves a lot of time because you don’t need to think about the framework as much; all you need to do is fill in the blanks.
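The fill-in-the-blanks idea can be sketched with Python’s standard `string.Template`. The template fields and values here are hypothetical; a real screener or research-plan template would follow the same pattern with its own fields:

```python
from string import Template

# Hypothetical research-plan template; screeners and recruitment
# queries can be templated the same way.
PLAN_TEMPLATE = Template(
    "Study: $study_name\n"
    "Goal: $goal\n"
    "Sessions: $n_sessions x $minutes min\n"
)

# Filling in the blanks for one study round.
plan = PLAN_TEMPLATE.substitute(
    study_name="Player redesign round 2",
    goal="Validate clarity of feed vs. player",
    n_sessions=8,
    minutes=60,
)
print(plan)
```

`substitute` raises a `KeyError` if a blank is left unfilled, which is a useful guard against shipping an incomplete plan.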
Solution: Frequent and immediate engagements with leads.
Daily immediate team syncs
Daily syncs keep team members close and aligned as a unit, even as plans change.
Engaging leads early for alignment
Bi-weekly leads reviews ensure that time for alignment is carved out.
Moreover, protect the quality of the work. You will often feel the urge to “cross things off your list,” and quality is easily sacrificed in that mode. Be grateful when your manager reminds you to take more time to produce insights, and do your best to protect quality yourself.
Last but not least, never say “No”; get into the habit of saying “Yes, and…” and proposing an alternative. With clear alternative plans and communication of leads’ priorities, people are receptive to and understanding of re-prioritization.
Credit: BecomingHuman By: The AI LAB