RefWorks
Enterprise redesign that earned voluntary migration in high-stakes academic research
Redesigned every workflow that touched user data around one constraint: nothing changed without the user being able to verify integrity first. Legacy platform was sunset through voluntary adoption, not mandate.
- Purpose: Understand why expert researchers resisted migration to a modern platform and design an experience trustworthy enough for high-stakes academic work.
- Role: Senior UX Designer owning research, user validation, and design specification. Reframed the migration problem and specified workflows across deduplication and project isolation.
- Team: 6–12 engineers, 1 PM, UX colleagues across multiple releases. Distributed across Ann Arbor, New York, Israel, and Uruguay.
- Context: Enterprise academic research platform used by librarians, researchers, and institutions globally.
- Duration: ~4 years (discovery through legacy sunset)
- Status: Shipped; legacy platform voluntarily sunset June 30, 2023
The users who influenced purchasing wouldn't move to the new platform
The new platform worked — for the wrong users. RefWorks had rebuilt its platform from the ground up. For students and casual researchers, the new version was more modern-looking and simpler to use. Adoption among those groups wasn't the problem.
Power users influenced institutional purchasing. The problem was librarians, research administrators, and systematic review specialists. These users managed large reference collections, ran complex workflows across databases, and supported faculty and research teams. They also controlled institutional purchasing decisions. If they refused to migrate, institutions would cancel rather than maintain a platform their most important users rejected.
The gap was existential. The new platform was gaining users it didn't need to convince while failing to move the users it couldn't afford to lose.
The platform couldn't prove data integrity — and for these users, unverified data was a career risk
The resistance wasn't about change. The new platform was simpler for basic tasks, which made it easy to adopt for students and casual users. But for power users running complex workflows — systematic reviews, large collection management, cross-database operations — the new platform was missing capabilities they depended on daily. These weren't users who feared change. They were users whose workflows exposed gaps the platform hadn't addressed yet.
Professional credibility depended on verifiable data. These users' reputations rested on data integrity they could personally verify. A systematic review with duplicate references could be invalidated. An institutional collection with corrupted citations could undermine published work. Every workflow that touched their data — deduplication, migration, collection management — was a point where integrity could silently break.
No amount of visual polish would resolve this. Until power users could verify data integrity at every step, adopting the platform meant accepting professional risk they had no reason to take.
Every workflow was redesigned so users could verify integrity before anything changed
One constraint applied everywhere. The intervention wasn't building new features. It was applying one structural principle across every workflow that touched user data: nothing changed without the user being able to see what would happen, verify correctness, and control the outcome.
- Deduplication: The system flagged potential duplicates but never removed them automatically. Users saw exactly which references were matched and made the final call. A redesigned view made side-by-side comparison possible where the previous interface had hidden the information needed to decide.
- Projects: Self-contained workspaces that isolated collections from each other, so work in one area couldn't silently corrupt another.
- Batch migration: Administrators could see everything that would transfer, identify discrepancies, and fix issues before committing — enabling white-glove support instead of hoping automation got it right.
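The deduplication principle above — the system may flag, but only the user may merge — can be sketched in pseudocode-style Python. This is illustrative only: the names (`Reference`, `flag_duplicates`, `merge_if_confirmed`) and the title-matching heuristic are hypothetical, not RefWorks' actual implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Reference:
    ref_id: str
    title: str
    source_db: str

def flag_duplicates(refs):
    """Group references whose normalized titles match.

    This step only flags candidates -- nothing is merged
    or removed automatically.
    """
    groups = {}
    for ref in refs:
        key = ref.title.strip().lower()
        groups.setdefault(key, []).append(ref)
    return [group for group in groups.values() if len(group) > 1]

def merge_if_confirmed(group, confirm):
    """Merge a candidate group only when the user explicitly approves.

    `confirm` receives the full group so the user can inspect source
    databases and metadata side by side before deciding.
    """
    if not confirm(group):
        return group      # user declined: every record is kept
    keeper = group[0]     # user-approved survivor
    return [keeper]
```

The design point the sketch makes: the flagging step and the destructive step are separate functions, and the destructive one cannot run without an explicit confirmation callback, so data authority stays with the researcher by construction.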
Why This Mattered for Adoption
Adoption was binary, not gradual. In academic research, a platform that can't guarantee data integrity doesn't get used cautiously. It gets rejected entirely. Librarians and research administrators controlled institutional purchasing, and they would not recommend a tool they couldn't personally verify.
Speed and polish wouldn't have been enough. Legacy RefWorks was slow. Librarians would run deduplication, leave to get coffee, and come back to pages of manual review. They didn't mind, because the results were reliable. A faster, more modern platform that couldn't guarantee the same accuracy would have been rejected regardless of how much better it looked.
Verification had to be built into the workflow. Any approach that depended on users following the right steps, remembering workarounds, or completing training would have failed at scale. The system had to make integrity verifiable by default, not as a reward for careful users.
Key Strategic Decisions
Each decision changed how the system handled data integrity, verification, or user control: the gates that determined whether expert users would migrate.
- Observed: Users maintained multiple accounts to prevent cross-contamination between research projects.
- Decision: Shipped self-contained workspace isolation within a single account before building any migration tooling.
- Tradeoff: Added permissions and data architecture complexity. Delayed migration tooling.
- Trust Implication: Data separation became structurally enforced rather than dependent on workarounds, so reliability was demonstrated before asking users to commit.
- Observed: Deduplication selected one record to keep, but users needed to verify source databases and metadata before approving any exclusion.
- Decision: Required explicit user confirmation for every merge. Built comparison views for side-by-side verification.
- Tradeoff: Significantly slower than automated merging. Required substantial UI investment in comparison views.
- Trust Implication: The system could flag duplicates but could not act without user verification, keeping data authority with the researcher.
- Observed: Users did not trust batch migration to transfer data correctly across platforms.
- Decision: Designed user-initiated import with before/after reference counts and full transfer visibility.
- Tradeoff: Could not accelerate migration at institutional scale. Slower adoption curve.
- Trust Implication: Migration became a user-verified event rather than an administrative action, shifting risk acceptance to the person with the most at stake.
- Observed: Power users who controlled institutional purchasing were not migrating despite the new platform reaching feature parity milestones.
- Decision: Redefined migration readiness from feature parity to user-verified data integrity. Delayed legacy sunset until trust requirements were met.
- Tradeoff: Shifted engineering capacity from new features to reliability and verification work. Extended legacy maintenance.
- Adoption Implication: Sunset timing became dependent on demonstrated trust rather than feature completion, forcing investment in reliability work before setting a date.
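The migration decision above — user-initiated import with before/after reference counts — can be sketched as a verification gate. Everything here is a hypothetical illustration (`verified_import` and the `transfer` callback are invented names); it shows the principle that a transfer surfaces discrepancies rather than committing silently.

```python
def verified_import(source_refs, transfer):
    """User-initiated import with before/after count verification.

    Counts references before and after the transfer step and refuses
    to commit a transfer that dropped records. `transfer` stands in
    for the platform's actual import mechanism.
    """
    expected = len(source_refs)
    imported = transfer(source_refs)
    report = {
        "expected": expected,
        "imported": len(imported),
        "discrepancy": expected - len(imported),
    }
    if report["discrepancy"] != 0:
        # Surface the gap for user review instead of silently
        # committing a partial transfer.
        raise ValueError(f"Import mismatch: {report}")
    return report
```

The report makes the migration a user-verified event: the person with the most at stake sees the counts match before accepting the result.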
Impact at a Glance
The legacy platform was sunset June 30, 2023. The features that addressed expert user requirements — workspace isolation, human-in-the-loop deduplication, and user-controlled migration — shipped across multiple releases leading up to that date.
- June 30, 2023: Legacy platform sunset through voluntary migration, not mandate.
- ~16 months between batch migration shipping and legacy sunset, after trust-building features were in place.
Metric Interpretation: These signals reflect delivery against user-defined requirements, not performance benchmarks. The legacy platform was sunset after the features expert users identified as prerequisites were shipped and adopted.
Durability: Adoption held because verification was structural. Users migrated because the platform met the requirements they had defined, not because alternatives disappeared.
How This Sharpened My Judgment
Lessons from Real-World Use
- In high-stakes domains, trust beats features. Users tolerate slower workflows if reliability is proven and resist improvements that introduce uncertainty.
- Resistance often signals unmet requirements, not emotion. Asking "what are users protecting?" reveals more than assuming "users resist change."
- Transparent automation earns adoption. Showing users why a decision was made creates confidence that black-box systems cannot match.
- User-controlled migration builds trust that batch migration cannot. Letting users verify their own data, even if slower, creates advocates instead of reluctant adopters.
- Design the hardest trust problem first. Shipping workspace isolation before migration tools meant the platform could demonstrate reliability before asking users to commit.
A Transferable Pattern:
In high-stakes domains, users protect what works. Any improvement that introduces uncertainty will be resisted — not because users fear change, but because they're managing real risk.
The question isn't "How do we get them to move?" It's "What would make moving safe?"