Part I. A Patient Named Edith & Why This Matters
In January 2010, a woman named Edith Harrison walked into the Penn Memory Center with complaints her family had been noticing for months — misplaced keys, repeated questions, a subtle but persistent unraveling of the routines that had anchored her life for decades. Her primary care physician had prescribed antibiotics, attributing her confusion to a possible urinary tract infection. It was not. As Dr. Jason Karlawish recounts in his landmark book The Problem of Alzheimer's, Edith's story is not an outlier. It is the norm. The average delay between symptom onset and an Alzheimer's diagnosis remains stubbornly long — often three to five years in primary care settings — and for Parkinson's disease, by the time a tremor is visible to the naked eye, roughly 60 to 80 percent of dopaminergic neurons in the substantia nigra have already been lost.
The question that animates this article is deceptively simple: What if the body itself could be the diagnostic instrument? Not through a blood draw or a PET scan scheduled months in advance, but through the micro-patterns of daily life — the way a person walks to the kitchen, the cadence of their voice on a phone call, the restlessness of their sleep at 3 a.m. These are the signals that wearable sensors and artificial intelligence are now learning to read.
This is not speculative futurism. A systematic review published in the Journal of Medical Internet Research in early 2026 evaluated studies using wearable and mobile health technologies for continuous monitoring of sleep, physical activity, and circadian rhythms, concluding that AI-driven digital biomarkers now offer a credible pathway toward scalable, early detection of cognitive impairment. A parallel scoping review in npj Digital Medicine analyzed 86 AI models built on digital biomarker data and found that models targeting Alzheimer's detection achieved an average AUC of 0.887 — placing them in the same accuracy range as many established radiological screening tools.
The clinical stakes are not abstract. Alzheimer's disease currently affects approximately 6.9 million Americans aged 65 and older, and the global prevalence of Parkinson's disease has more than doubled in the past 25 years. Together, these two neurodegenerative conditions account for the majority of dementia-related disability worldwide. Early detection does not yet equate to cure — but it opens the door to enrollment in disease-modifying trials, lifestyle interventions with demonstrated cognitive benefits, advance care planning, and the dignified management of a trajectory that families deserve to understand before crisis forces the conversation.
Yet for all this promise, the field remains fragmented. Most models lack external validation. Regulatory pathways are still being carved out. And the human side of this equation — the patient who removes a wristband because it itches, the caregiver who cannot parse a dashboard, the clinician who has seven minutes per visit and no protocol for interpreting wearable data — is the gap that no algorithm alone can close.
This article is an attempt to bridge those worlds: the clinical, the technical, and the deeply human. It is written for neurologists, biomedical engineers, geriatricians, data scientists, product designers, caregivers, and anyone who believes that the future of neurodegenerative care will be shaped as much by what we wear as by what we prescribe.
Part II. A Century of Fumbled Handoffs
The history of Alzheimer's research is, as Karlawish frames it, a story of brilliant discovery followed by catastrophic institutional neglect. Alois Alzheimer, Franz Nissl, Oskar Fischer, and Emil Kraepelin had identified the plaques and tangles that define the disease pathology by the early 1900s. But the two World Wars gutted Germany's scientific infrastructure, and America's post-war psychiatric establishment, dominated by Freudian psychodynamic theory, dismissed senile dementia as a natural consequence of aging or, worse, as unresolved neurosis.
It took until 1976 for Robert Katzman's essay in the Archives of Neurology to reframe Alzheimer's as a public health crisis rather than an inevitability of old age. Robert Butler's Pulitzer Prize-winning work, Why Survive? Being Old in America, laid bare the systemic neglect of aging populations. But bureaucratic failures compounded the problem — FDA approval delays and Medicare's inability to cover adequate care condemned countless patients to suffer in silence for decades longer.
A glimmer of hope emerged with the National Alzheimer's Project Act, signed by President Obama in 2011, which sought to coordinate research efforts, improve patient care, and accelerate the development of treatments. But as Karlawish notes, the emotional toll of the lost years, amplified by Freudian psychiatry's missteps and institutional ignorance, remains a painful reminder of how societal and scientific neglect can devastate entire generations. His call to action goes beyond policy and funding, demanding a moral reckoning to ensure that Alzheimer's patients are never again abandoned in the shadows of ignorance and stigma.
Critical Milestones
Part III. The Hippocampus, the Amygdala & Why the Brain Betrays Itself
To understand what wearable sensors are trying to detect, you need to understand what is breaking. The hippocampus — named for the Greek hippokampos, meaning "seahorse," after its curved shape resembling the mythical fishtailed horses pulling Poseidon's chariot — is the brain's primary engine for memory formation, spatial navigation, and learning. It is part of the limbic system, deeply involved in converting short-term memories into long-term ones. And it is, tragically, one of the first structures to atrophy in Alzheimer's disease.
The hippocampus operates in tight coordination with the amygdala, which assigns emotional weight to experiences. This is why a scent can transport you back to childhood, or why the face of a loved one can trigger a physiological response before conscious recognition kicks in. The amygdala's involvement explains why emotionally significant memories are often the last to fade in Alzheimer's patients — a fact with direct implications for therapeutic interventions, from music therapy to personalized voice reminders delivered through wearable devices.
The brain encodes memory through five principal mechanisms: Association (linking new information to existing knowledge), Novelty (prioritizing unexpected events), Repetition (strengthening neural pathways through rehearsal), Emotional tagging (the amygdala's reinforcement of affect-laden experiences — we remember what our amygdala has linked to pleasure or pain), and Contextual binding (anchoring memories to environmental cues). In Alzheimer's disease, the hippocampal circuits responsible for encoding and retrieval are progressively dismantled. Short-term memory fails first. Autobiographical memory — the narrative of who you are — falls last. But when it goes, it takes identity with it.
In Parkinson's disease, the primary target is different but the devastation is comparable. Dopaminergic neurons in the substantia nigra degenerate, disrupting the basal ganglia circuits that govern voluntary movement. Depression, anxiety, REM sleep behavior disorder, constipation, loss of smell, and cognitive impairment are all part of the clinical picture, and many precede the tremor by years — sometimes a decade or more. This is precisely the window that digital biomarkers aim to exploit: the prodromal phase, where the disease is active but not yet visible.
Part IV. Two Diseases, One Design Challenge
While both conditions share the label "neurodegenerative," they attack different systems, present different clinical challenges, and demand different approaches from wearable technology.
| Dimension | Alzheimer's Disease | Parkinson's Disease |
|---|---|---|
| Primary pathology | Amyloid-beta plaques, tau tangles, hippocampal atrophy | Loss of dopaminergic neurons in substantia nigra; Lewy body accumulation |
| Cardinal symptoms | Memory loss, disorientation, language deterioration, personality changes | Tremor, rigidity, bradykinesia, postural instability, freezing |
| Prodromal signals | Gait slowing, speech changes, sleep fragmentation, executive dysfunction | REM sleep disorder, constipation, anosmia, depression, reduced arm swing |
| Wearable targets | Gait speed & variability, circadian disruption, speech cadence, GPS wandering | Tremor frequency & amplitude, freezing detection, gait asymmetry, falls, dyskinesia |
| Key design constraint | Cognitive impairment: may not understand device, may remove it | Motor impairment: may be unable to clasp, charge, or interact with device |
| Caregiver role | Device manager; behavioral alert interpreter; safety monitor | Medication timing support; fall response; mobility advocate |
The Progression Arc
Part V. Digital Biomarkers: Reading the Body's Hidden Signals
The FDA defines a digital biomarker as "a characteristic or set of characteristics collected through digital health technologies, which serve as indicators of normal biological processes, pathogenic processes, or responses to exposure or interventions." In practice, this means data from accelerometers, gyroscopes, photoplethysmography sensors, microphones, and touchscreens — hardware already in devices millions carry daily.
The 2025 Gramkow et al. scoping review found that the most studied categories were rest and activity patterns (39%), speech (17%), and gait (14%). Only 16% reported diagnostic outcomes. Only 3% addressed prognosis. The field is generating tremendous signal but has not yet converged on the clinical endpoints that matter most.
Sources: Gramkow et al., npj Digital Medicine (2025); Buracchio et al.; DiMe landscape review.
Gait: The Sixth Vital Sign
Gait speed may be as clinically informative as blood pressure. Buracchio and colleagues found that an inflection point occurs roughly 12 years before MCI diagnosis — the annual decline rate shifts from –0.005 m/s/year to –0.023 m/s/year, a fivefold acceleration invisible to casual observation but unmistakable to a wrist or ankle sensor. IMU sensors can estimate step count, stride length, and variability with clinically useful accuracy. A 2025 University of Maryland study used a single wearable sensor during a Timed Up and Go test to distinguish PD from other parkinsonisms.
IMU-based step counts deviate from ground truth by roughly –6.7% to +6.2%. Stride-length estimation achieves median errors below 5%. Contact-pressure sensors in instrumented socks capture stance/swing ratios with >95% correlation. Fusing geopositioning data with IMU signals further improves composite metrics. These accuracy levels suffice for longitudinal monitoring, where the signal of interest is change from an individual baseline rather than an absolute measurement.
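The inflection-point idea can be made concrete with a short sketch. The following Python snippet fits decline rates before and after a candidate split point and flags an acceleration; the 3× factor and the split-point framing are illustrative assumptions, not the thresholds used in the Buracchio study.

```python
import statistics

def annual_slope(years, speeds):
    """Least-squares slope of gait speed (m/s) against time (years)."""
    my, ms = statistics.fmean(years), statistics.fmean(speeds)
    num = sum((y - my) * (s - ms) for y, s in zip(years, speeds))
    den = sum((y - my) ** 2 for y in years)
    return num / den

def decline_accelerated(years, speeds, split_year, factor=3.0):
    """Flag when the decline rate after split_year is at least `factor`
    times steeper than before it. The factor is an illustrative
    placeholder, not a validated clinical cutoff."""
    early = [(y, s) for y, s in zip(years, speeds) if y < split_year]
    late = [(y, s) for y, s in zip(years, speeds) if y >= split_year]
    slope_early = annual_slope([y for y, _ in early], [s for _, s in early])
    slope_late = annual_slope([y for y, _ in late], [s for _, s in late])
    return slope_late < 0 and abs(slope_late) >= factor * abs(slope_early)
```

Fed yearly average speeds, a shift from –0.005 m/s per year to –0.023 m/s per year trips the flag long before the change is visible to the eye.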
Speech: The Sound of Cognitive Decline
Pauses, hesitations, word-finding failures, reduced syntactic complexity, and prosody changes are all markers AI can detect. Boston University researchers achieved 96% accuracy identifying AD from phone calls. IBM predicted onset seven years before diagnosis using subtle linguistic shifts. For Parkinson's, speech analysis captures hypophonia, monotone delivery, and articulation imprecision. Because speech data can be captured passively via smartphone microphone, it offers a uniquely scalable screening pathway.
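To give a flavor of how one such feature is computed, here is a minimal sketch that estimates the fraction of a recording spent in pauses, given per-frame energy values from any audio front end. The silence threshold and minimum run length are assumed defaults; this is not the BU or IBM pipeline.

```python
def pause_ratio(frame_energies, silence_threshold, min_pause_frames=3):
    """Fraction of frames spent in pauses, where a pause is a run of
    at least min_pause_frames consecutive frames below the threshold.
    Threshold and minimum run length are illustrative defaults."""
    paused, run = 0, 0
    # A sentinel frame above the threshold flushes any trailing run.
    for energy in list(frame_energies) + [silence_threshold + 1]:
        if energy < silence_threshold:
            run += 1
        else:
            if run >= min_pause_frames:
                paused += run
            run = 0
    return paused / len(frame_energies)
```

Tracked longitudinally, a rising pause ratio against a person's own baseline is exactly the kind of slow drift that episodic clinic visits miss.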
Sleep: The Night Shift
Alzheimer's patients exhibit disrupted circadian rhythms — fragmented sleep, reduced slow-wave sleep, increased nighttime wakefulness. Mounting evidence suggests these patterns contribute to disease progression by impairing glymphatic clearance of amyloid-beta during sleep. The relationship appears bidirectional: amyloid disrupts sleep, disrupted sleep accelerates amyloid. Wearable actigraphy captures these patterns continuously, generating longitudinal data no clinic visit can replicate.
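A simple actigraphy-derived fragmentation metric can be sketched as follows; the wake threshold and one-minute epoch length are assumptions for illustration, and commercial devices use their own proprietary scoring.

```python
def wake_transitions_per_hour(epoch_counts, wake_threshold=40, epoch_minutes=1):
    """Sleep-to-wake transitions per hour, computed from actigraphy
    activity counts (one value per epoch). Higher values indicate more
    fragmented sleep. The threshold is an illustrative placeholder."""
    awake = [c >= wake_threshold for c in epoch_counts]
    transitions = sum(1 for a, b in zip(awake, awake[1:]) if not a and b)
    hours = len(epoch_counts) * epoch_minutes / 60
    return transitions / hours
```

Because the metric is computed every night, it yields a per-patient trend line rather than a single snapshot, which is what makes the bidirectional sleep-amyloid hypothesis testable at scale.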
Eye Tracking & Fine Motor
MIT/Harvard AI models analyzing eye movement achieve ~93% accuracy identifying AD. Finger tapping speed and handwriting analysis (pressure, speed, stroke direction via touchscreen) add further dimensions to composite digital phenotyping. These can be administered via smartphone, making them accessible beyond specialist centers.
Gait Analysis
Speed, stride length, variability, asymmetry. Detectable 12+ years before MCI. IMU sensors on wrist, hip, ankle, or in shoes.
Speech Patterns
Pauses, word-finding, prosody, complexity, volume. 96% accuracy from phone calls (BU). Passive, scalable, no clinic visit.
Sleep & Circadian
Fragmentation, slow-wave loss, nighttime waking. Bidirectional with amyloid clearance. 39% of all AD digital biomarker studies.
Eye Tracking
Pupil dilation, saccade patterns, focus maintenance. 93% accuracy (MIT/Harvard). Potential neuroimaging replacement for screening.
Fine Motor
Finger tapping, handwriting pressure & stroke. Smartphone-administrable. Adds to composite risk scoring.
Autonomic Signals
Heart rate variability, skin conductance, temperature. Markers for delirium, distress, autonomic dysfunction in PD.
Part VI. AI in the Diagnostic Pipeline
Deep learning models now analyze PET scans to detect amyloid-beta and tau patterns with precision outperforming many human readers. A UCSF team identified Alzheimer's pathology six years before clinical diagnosis. Stanford's MRI model achieved >94% accuracy distinguishing AD from healthy controls. But the paradigm shift may be in blood: C2N's PrecivityAD test, plasma-based p-tau 217, NfL, and GFAP assays combined with AI pattern recognition could democratize screening beyond centers with PET scanners.
A Nature Reviews Drug Discovery article published in March 2026 found that wearables are now integrated into interventional drug trials, used not just for monitoring but as primary and secondary endpoints. Both FDA and EMA are issuing frameworks for validation, data integrity, and patient safety in digital health technology integration. The LUMA trial (BIIB122, LRRK2 inhibitor), ACTIVATE trial (BIA 28-6156, GCase targeting), and AHEAD 3-45 (lecanemab in preclinical AD) all represent plausible disease-modifying approaches currently in progress.
Part VII. Designing for Dignity
The most sophisticated sensor is useless if the patient will not wear it. Alzheimer's patients may perceive an unfamiliar wristband as foreign and remove it. Parkinson's patients may find clasps impossible. Patients with sensory changes may be irritated by textures no one else notices. The design challenge is empathic, not primarily technical.
Materials must be hypoallergenic, lightweight, breathable. Medical-grade silicone or soft fabric blends. No seams, edges, or rigid housings. Temperature-adaptive materials or gentle compression to soothe rather than irritate. The device should feel like clothing, not hardware.
Disguise devices as everyday objects: a classic wristwatch, a pendant, a brooch. Anything that reads as "medical device" triggers resistance. Neutral colors, soft textures, analog aesthetics. A smartwatch that looks like a traditional timepiece stays on the wrist.
No buttons, no screens, no batteries to swap. Kinetic or solar charging, auto-sync via Bluetooth, OTA firmware updates. If a caregiver must intervene daily, adoption will fail.
Alzheimer's: GPS geofencing, haptic cues for agitation, personalized audio (familiar voice, beloved music) triggered by distress signals. Parkinson's: Rhythmic cueing for freezing episodes, fall detection with auto-alerts, adaptive stabilization for tremor.
AI-powered triage filters routine data from actionable alerts. Predictive warnings — likely agitation, wandering, or falls — are far more valuable than after-the-fact notifications. Radical simplicity: what to show, what to suppress, when to escalate.
Encryption and strict access controls are non-negotiable. HIPAA is a floor, not a ceiling. A device that feels like surveillance erodes dignity. A device that feels like support preserves it. The line is thin.
Patients, caregivers, neurologists, geriatricians, nurses, PTs, and OTs at the design table — not as advisory board reviewers, but as co-creators from the earliest concept stage.
A radial visualization of daily activity patterns: walking speed, room transitions, medication cabinet openings, phone usage, sleep. Clinicians read it at a glance. Deviations from the patient's baseline serve as early warning flags. This paradigm — continuous, passive, longitudinal, baseline-relative — represents the future of wearable-based neurodegenerative care.
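The baseline-relative logic behind such a dashboard can be sketched in a few lines. The 14-day window and z-score cutoff below are illustrative defaults, not clinically validated parameters.

```python
import statistics

def baseline_deviation_flags(daily_values, window=14, z_cutoff=2.5):
    """Flag each day whose value deviates more than z_cutoff standard
    deviations from the patient's own trailing `window`-day baseline.
    Days without enough history are never flagged."""
    flags = []
    for i, value in enumerate(daily_values):
        baseline = daily_values[max(0, i - window):i]
        if len(baseline) < window:
            flags.append(False)  # insufficient history to form a baseline
            continue
        mu = statistics.fmean(baseline)
        sd = statistics.pstdev(baseline)
        flags.append(sd > 0 and abs(value - mu) > z_cutoff * sd)
    return flags
```

The same function applies unchanged to walking speed, room transitions, or nighttime awakenings: the patient is compared only to themselves, which sidesteps much of the population-level variability that plagues one-size-fits-all cutoffs.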
Part VIII. Delirium & Falls: Where Wearables Save Lives Today
Delirium is under-detected in ~60% of cases. It accelerates cognitive decline and increases dementia risk. Inouye's multicomponent strategies reduce incidence by up to 40%, but require continuous monitoring. Wearables fill this gap: HR variability and temperature sensors detect physiological signatures hours before behavioral symptoms. Sleep monitoring flags circadian disruption. Movement sensors distinguish purposeful activity from delirious restlessness. Smart home integration triggers environmental adjustments — dimming lights, reducing noise, prompting hydration.
| Clinical Need | Wearable Capability | Actionable Outcome |
|---|---|---|
| Delirium early warning | HRV, skin temp, SpO₂, sleep fragmentation | Alert hours before behavioral symptoms |
| Fall detection | Accelerometer + gyroscope fusion, auto-emergency call | Rapid response with GPS location |
| Fall prediction | Longitudinal gait speed decline, stride variability | Proactive PT referral or assistive device |
| Wandering prevention | GPS + geofencing + push notification | Real-time alert when patient exits safe zone |
| Medication risk | Med timing logs × dizziness/movement data | Identify meds contributing to fall/confusion |
| Environmental | Smart home sensors (light, floor, temp, noise) | Automated adjustments reducing triggers & hazards |
Part IX. A Night in the Life
Margaret, 78 — Alzheimer's Diagnosis, History of Nocturnal Delirium
It is 2:14 a.m. Margaret's wearable — a soft silicone band that looks like the bracelet her granddaughter gave her — detects a 22% spike in heart rate, a 1.3°F skin temperature rise, and irregular movement sharply deviating from her nighttime baseline.
The AI model, trained on three months of Margaret's data, classifies the pattern as consistent with early delirium (confidence: 0.84) and initiates a tiered response:
Tier 1 (Automated): Smart home dims lights to warm 2700K. A recording of her daughter's voice reading a beloved passage plays at low volume. Room temperature drops 2°F.
Tier 2 (Caregiver alert): Sarah, two blocks away, gets a push notification: "Margaret's nighttime pattern suggests possible delirium onset. Calming environment activated. Check-in recommended within 30 minutes."
Tier 3 (Clinical escalation): If vitals don't stabilize in 45 minutes, or if movement sensors detect wandering, the system alerts Margaret's neurologist with a 4-hour data summary.
At 2:38 a.m., Margaret's heart rate settles. The environmental interventions appear to have worked. Sarah calls through the room speaker. Margaret responds, groggy but oriented. No fall. No ER visit. No fracture.
The next morning, the neurologist reviews a trend chart showing increased nocturnal episodes over two weeks. She adjusts evening medication timing and orders a urinalysis — exactly the kind of workup that was missed in Edith Harrison's case sixteen years earlier.
This scenario is illustrative — no single system executes the full sequence today. But every component (vital sign monitoring, movement analysis, smart home integration, tiered alerting, trend visualization) exists in current prototypes or commercial devices. The integration is the engineering challenge. The clinical logic already exists.
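The tiered escalation in this scenario reduces to a small decision function. All thresholds below, including the 0.6 and 0.8 confidence cutoffs and the 45-minute window, are taken from the narrative above or invented for illustration; they are not cutoffs from any deployed product.

```python
def triage(confidence, minutes_unresolved=0, wandering_detected=False):
    """Map a delirium classifier's output to the scenario's response
    tiers, returning the highest tier that should be active now.
    Thresholds are illustrative placeholders."""
    if confidence < 0.6:
        return "log_only"                 # below alerting threshold
    if minutes_unresolved >= 45 or wandering_detected:
        return "tier3_clinician_alert"    # vitals not settling, or wandering
    if confidence >= 0.8:
        return "tier2_caregiver_alert"    # notify nearby caregiver
    return "tier1_environment_only"       # automated calming response
```

In Margaret's case, a confidence of 0.84 with no unresolved timer lands at Tier 2; had her vitals not settled within 45 minutes, the same function would escalate to Tier 3.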
Part X. The Gaps We Cannot Afford to Ignore
Of the 86 AI models reviewed, only two had been externally validated, and only three reported model calibration. Without multi-site, prospective validation, impressive AUCs cannot translate into reliable clinical tools. Altoida remains the only AD-specific device with both FDA clearance and CE marking.
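For readers less familiar with the metric: AUC is the probability that a randomly chosen positive case is scored above a randomly chosen negative one, and the point of external validation is to recompute it on data from sites the model never saw. A stdlib-only sketch of the computation itself:

```python
def auc(labels, scores):
    """Mann-Whitney form of the area under the ROC curve: the
    probability that a random positive outranks a random negative,
    with ties counted as half."""
    pos = [s for y, s in zip(labels, scores) if y]
    neg = [s for y, s in zip(labels, scores) if not y]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Reporting this once on a held-out split from the same cohort is internal validation; the scoping review's concern is that almost none of the 86 models reported it on an independent cohort.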
Who owns continuous health data? How is it governed under HIPAA, GDPR, and emerging regulations? Training datasets remain disproportionately white, Western, and well-resourced. Until diversity is achieved, generalizability remains in question.
For a patient diagnosed with preclinical neurodegeneration today, actionable options remain limited. Early detection without early treatment risks creating people who know they're sick but can't yet be helped — a psychological burden the field hasn't addressed.
Caregiver burden: every data stream creates monitoring obligations, and without clinical triage protocols, alerts become noise. Clinical workflow: a neurologist with seven minutes per visit needs decision-support tools that synthesize wearable data into actionable recommendations, not raw feeds; without them, even the richest sensor stream is just another burden on an overstretched system.
Part XI. Where This Is Headed
Ultra-Early Screening
Composite panels — gait, speech, sleep, blood biomarkers — enabling screening decades before symptoms, through devices people already own.
Precision Therapeutics
AI synthesizing wearable data, genomic risk, and treatment response for individualized care plans that evolve with the patient.
Emotionally Adaptive AI
Devices detecting distress via voice, HR, and movement, responding with personalized interventions — a familiar voice, a song, a vibration.
Ambient Intelligence
Smart homes with sensors in floors, furniture, walls. The wearable disappears. Monitoring without wearing or charging anything.
Multimodal Fusion
Neuroimaging + genetics + blood biomarkers + digital phenotyping into unified models capturing full disease complexity.
Wearables as Endpoints
Continuous real-world data replacing episodic clinic assessments as primary trial endpoints — already underway per Nature Reviews Drug Discovery.
Self-healing, stretchable, breathable electronic materials advancing toward wearables indistinguishable from skin. E-tattoos and epidermal electronics — thin sensor patches applied directly to the body — combined with energy-harvesting (thermoelectric generators, sweat-powered biofuel cells) could operate indefinitely. The patient would not just forget they're wearing a sensor. They would be unable to tell.
Part XII. Designing for Humanity
The wearable revolution will not be won by the most accurate algorithm or sensitive accelerometer. It will be won by teams — engineers, clinicians, designers, caregivers, and patients — who understand that technology must serve the person, not the other way around.
Karlawish's account of Edith Harrison is a reminder of what happens when the system doesn't see the patient. Inouye's work on delirium is a reminder that the most dangerous conditions are often the most invisible. Kaye's Mandala Model reminds us that the richest clinical insight sometimes comes not from a single test but from the quiet accumulation of daily patterns observed over time.
The future now hinges on choices: how we validate these tools with the rigor they deserve, how we regulate without stifling innovation, how we ensure they work for all patients, how we design for the people who need them most, and whether we prioritize dignity over data.
Over a century after Alois Alzheimer described the plaques and tangles that bear his name, the disease lacks a cure. Parkinson's, first characterized in 1817, still lacks a therapy that halts its progression. But for the first time, we have tools to see them coming — not in a scan three months away, but in the rhythm of a footstep, the cadence of a voice, the restlessness of a night's sleep.
What we do with that sight is the question that will define the next chapter. The answer will be written not just in code and silicon, but in the quality of attention we choose to pay to the people who need it most.