Project Overview
About
Despite the rapid expansion of artificial intelligence into the domain of healthcare, social service providers—specifically, social workers, community health workers, and peer support specialists—have been relatively overlooked as potential users.
In collaboration with the IC2 Institute, my Capstone team researched the priorities, pain points, and perceptions of social service providers in order to identify opportunities for mindful technological integration. What problems would social service providers want AI to solve? How might AI enhance the ability of these workers to meet the needs of their complex caseloads? What would AI look like, from their perspective?
Ultimately, we want to imagine a future in which a greater diversity of healthcare professionals can benefit from AI, while also addressing the dilemmas that these technologies provoke.
Objective
Identify opportunities for AI to augment the experiences and contributions of social service providers within integrated healthcare models.
Institution
University of Texas at Austin —
M.A. in Design Focused on Health
Client
IC2 Institute
Team
Isabel Alexander, Laura Long,
Karl Sheeran, Tanya Sasnouskaya
Role
Lead Researcher & Strategist
Skills
User interviews, contextual observations, intercept surveys, qualitative research synthesis,
graphic design, low-fidelity prototyping, concept validation
Timeline
14 weeks (part-time) — Spring 2024

Project Context
01. Research
Using a range of qualitative methods, we sought to understand the priorities, perceptions, and pain points of social service providers in order to augment their existing workflows via the mindful integration of AI.
Interviews
We conducted 16 in-depth interviews with social workers, community health workers, peer support specialists, and two subject matter experts (a foresight practitioner and an emergency physician). To guide our research, we constructed a triangular framework with questions about the interrelated workflow components that all healthcare providers have in common: patient care, interdisciplinary collaboration, and administrative tasks. We also probed factors that might influence AI adoption at the individual, organizational, and societal levels.
16 Interviews
- 8 Social Workers
- 3 Community Health Workers
- 3 Peer Support Specialists
- 2 Subject Matter Experts
Intercept Surveys
In addition to our structured interviews with recruited participants, we conducted intercepts at a recurring community health worker social event. We used this opportunity to distribute QR codes linked to a survey with a condensed set of interview questions.


Contextual Observations
I also did a ride-along with the Austin Community Health Paramedic (CHP) program, a novel initiative that connects patients who place non-urgent 911 calls with community resources instead of transporting them to the emergency department. Although community paramedics were not formally part of our scope, they provided a complementary perspective to our interviews with social service providers.


Analogous Research
Lastly, we visited the IC2 Institute for a prototype demonstration of a work-in-progress technology to assist behavioral health providers and their patients with real-time emotion data tracking.

Synthesis and Insights

Our synthesis board, containing observations, patterns, analysis, and insights.
After synthesizing our primary research, we arrived at nine insights that highlight both the interconnected challenges social service providers experience and potential avenues for improvement.


02. Define
Of the interviews we conducted, three stories particularly stood out to us. We created the following personas to collate our research and add a human touch to otherwise de-identified summaries. These archetypes helped us ask the right questions during ideation and provided guidelines for design development.
Personas




Samantha is a community health worker and resource coordinator at a small community clinic that primarily serves low-income and uninsured residents in her county. She chooses to take hand-written notes when meeting with her patients, because she has found that the use of a computer in the room feels impersonal and creates a barrier in the patient-provider relationship she is cultivating. After her patient encounter, Samantha has to re-document her notes into the EHR based on what she was able to jot down in the room. She states she is burnt out and frustrated by this inefficient use of her time, but feels forced to choose between the administrative and relational elements of her work. This leads her to feel like she is chronically underperforming.


Mattie is a social worker at a large academic hospital. Here, she sits down at her computer to prepare for an upcoming patient appointment. This patient has a complex history and is being seen by other providers, so there's a lot of health data to go through. Mattie told us she has spent up to 2 hours trying to make sense of the large quantity of information in the EHR. As a result, when she meets with the patient, she feels as though she is never able to go in with the full picture she wants.


Greg is a peer support specialist at an inpatient psychiatric hospital, where he serves as a mentor for patients experiencing mental health crises. Greg himself has been hospitalized for depression in the past, so he is able to find common ground with patients at this institution, providing coaching so that they can meet their clinical and personal goals. Part of Greg's responsibility is accompanying his patient to psychiatric appointments, but in many cases the psychiatrist does not understand his role in the patient's care. As a non-clinician, Greg feels like there is a certain hierarchy in this space, and that his advocacy for his patient isn't valued.
Opportunity Framing

03. Ideate
We facilitated workshops with our classmates, our client, and several of our end users to generate and refine ideas.
Brainstorming Workshops
To incorporate fresh perspectives into our work, we conducted a sprint-style brainstorming workshop with fellow classmates. In preparation, my team created two stacks of cards: one with our insights (the core problems we were trying to address) and one with scenarios (touchpoints within an SSP's workflow, such as chart review or interdisciplinary team meetings). Participants drew one card from each stack and had just a few minutes to generate as many ideas as possible that addressed that unique pairing. From there, we synthesized all of the ideas into six high-level solutions that we presented to our client. After discussing each idea, we had our client plot the feasibility and impact of each proposition on a 2x2 matrix (pictured below).

Concept Inspiration
My team also had the privilege of attending the 2024 Health AI for All conference, which included keynote presentations on the social, technical, clinical, ethical, and policy implications of deploying AI in historically underserved communities. We learned more about the specific capabilities of certain AI platforms, as well as the general best practices around co-design with relevant stakeholders.

Concept Validation
Lastly, we conducted two co-design sessions with interviewees who work at a local hospital and state government agency, respectively. After presenting our preliminary concept to these stakeholders, we asked them to outline the best and worst case scenarios for the three components of our solution. This exercise helped us refine details and identify implementation considerations.

Dell Children's Hospital

Texas Department of Health and Human Services

04. Prototype
We proposed a customized implementation of Natural Language Processing (NLP) to streamline post-encounter documentation, illuminate patterns in longitudinal patient data, and engage other members of the care team around social determinants of health.
Natural language processing is a family of machine-learning techniques for organizing, synthesizing, and making sense of written and spoken language, often with more consistency and efficiency than manual synthesis. In our concept, NLP processes the patient information (such as concerns, symptoms, questions, and dates) that is most relevant to SSPs.
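As a minimal sketch of the kind of extraction we had in mind (using the open-source spaCy library purely as an example, not a tool we committed to), the snippet below shows how date mentions could be pulled out of a free-text post-encounter note so they can be pre-filled into the EHR rather than re-typed.

```python
# Illustrative sketch only: extracting structured details from a free-text
# post-encounter note with spaCy (example library, not a committed tool choice).
# Assumes the small English model is installed:
#   pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

# Hypothetical note; not drawn from any real patient encounter.
note = (
    "Met with the patient on June 3 to follow up on her housing application. "
    "She reported trouble sleeping and asked whether transportation could be "
    "arranged for her cardiology appointment next Tuesday."
)

doc = nlp(note)

# Collect the date mentions the model recognizes, so they could be pre-filled
# into the EHR instead of re-typed after the visit.
dates = [ent.text for ent in doc.ents if ent.label_ == "DATE"]
print(dates)  # e.g. ['June 3', 'next Tuesday']
```

In practice, the same pipeline could be extended to flag the concerns, symptoms, and patient questions described above.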

"This saves me time."



"This helps me see."



"This lets me advocate."



The Bottom Line
It may sound ironic, but artificial intelligence might just be the tool that helps us feel most human. We heard from social service providers about the spectrum of tasks they are responsible for: some are administrative and bureaucratic, while others are deeply personal and relational. In general, social service providers want to spend their time on the relational side of that spectrum, engaging with their patients. We want AI to help them do this, and let the human element shine.








