Neurotechnology · Digital Health · Clinical Translation

The Invisible Stethoscope

How wearable sensors and digital biomarkers are rewriting the clinical playbook for Alzheimer's and Parkinson's — and why the biggest breakthroughs may come from what patients wear, not what they swallow.
By Paddy · March 2026 · 30 min read · Revised & expanded from the January 2025 edition
🧠

Digital Phenotyping

Continuous gait, speech, and sleep data as passive windows into neurodegeneration

Wearable Sensors

From consumer smartwatches to medical-grade IMU devices in clinical trials

🔬

Clinical Translation

Bridging AI models with FDA-recognized endpoints and real-world care

🤝

Humane Design

Building for dignity, comfort, and the patient who forgets they're wearing it

Part I · A Patient Named Edith & Why This Matters

In January 2010, a woman named Edith Harrison walked into the Penn Memory Center with complaints her family had been noticing for months — misplaced keys, repeated questions, a subtle but persistent unraveling of the routines that had anchored her life for decades. Her primary care physician had prescribed antibiotics, attributing her confusion to a possible urinary tract infection. It was not. As Dr. Jason Karlawish recounts in his landmark book The Problem of Alzheimer's, Edith's story is not an outlier. It is the norm. The average delay between symptom onset and an Alzheimer's diagnosis remains stubbornly long — often three to five years in primary care settings — and for Parkinson's disease, by the time a tremor is visible to the naked eye, roughly 60 to 80 percent of dopaminergic neurons in the substantia nigra have already been lost.

The question that animates this article is deceptively simple: What if the body itself could be the diagnostic instrument? Not through a blood draw or a PET scan scheduled months in advance, but through the micro-patterns of daily life — the way a person walks to the kitchen, the cadence of their voice on a phone call, the restlessness of their sleep at 3 a.m. These are the signals that wearable sensors and artificial intelligence are now learning to read.

This is not speculative futurism. A systematic review published in the Journal of Medical Internet Research in early 2026 evaluated studies using wearable and mobile health technologies for continuous monitoring of sleep, physical activity, and circadian rhythms, concluding that AI-driven digital biomarkers now offer a credible pathway toward scalable, early detection of cognitive impairment. A parallel scoping review in npj Digital Medicine analyzed 86 AI models built on digital biomarker data and found that models targeting Alzheimer's detection achieved an average AUC of 0.887 — placing them in the same accuracy range as many established radiological screening tools.

Clinical Context

The clinical stakes are not abstract. Alzheimer's disease currently affects approximately 6.9 million Americans aged 65 and older, and the global prevalence of Parkinson's disease has more than doubled in the past 25 years. Together, these two neurodegenerative conditions account for the majority of dementia-related disability worldwide. Early detection does not yet equate to cure — but it opens the door to enrollment in disease-modifying trials, lifestyle interventions with demonstrated cognitive benefits, advance care planning, and the dignified management of a trajectory that families deserve to understand before crisis forces the conversation.

Yet for all this promise, the field remains fragmented. Most models lack external validation. Regulatory pathways are still being carved out. And the human side of this equation — the patient who removes a wristband because it itches, the caregiver who cannot parse a dashboard, the clinician who has seven minutes per visit and no protocol for interpreting wearable data — is the gap that no algorithm alone can close.

This article is an attempt to bridge those worlds: the clinical, the technical, and the deeply human. It is written for neurologists, biomedical engineers, geriatricians, data scientists, product designers, caregivers, and anyone who believes that the future of neurodegenerative care will be shaped as much by what we wear as by what we prescribe.

Part II · A Century of Fumbled Handoffs

The history of Alzheimer's research is, as Karlawish frames it, a story of brilliant discovery followed by catastrophic institutional neglect. Alois Alzheimer, Franz Nissl, Oskar Fischer, and Emil Kraepelin had identified the plaques and tangles that define the disease pathology by the early 1900s. But the two World Wars gutted Germany's scientific infrastructure, and America's post-war psychiatric establishment, dominated by Freudian psychodynamic theory, dismissed senile dementia as a natural consequence of aging or, worse, as unresolved neurosis.

The Freudian stranglehold on psychiatry reduced dementia to vague concepts of repressed emotions and unresolved trauma, completely sidelining the biological pathology. Elderly patients with cognitive decline were stigmatized, ignored, or warehoused without meaningful treatment.
— Drawn from Karlawish's account of mid-20th century psychiatric practice

It took until 1976 for Robert Katzman's essay in the Archives of Neurology to reframe Alzheimer's as a public health crisis rather than an inevitability of old age. Robert Butler's Pulitzer Prize-winning work, Why Survive? Being Old in America, laid bare the systemic neglect of aging populations. But bureaucratic failures compounded the problem — FDA approval delays and Medicare's inability to cover adequate care condemned countless patients to suffer in silence for decades longer.

A glimmer of hope emerged with President Obama's National Alzheimer's Project Act in 2011, which sought to coordinate research efforts, improve patient care, and accelerate the development of treatments. But as Karlawish notes, the emotional toll of the lost years — amplified by Freudianism's missteps and institutional ignorance — remains a painful reminder of how societal and scientific neglect can devastate entire generations. His call for action goes beyond policy and funding, demanding a moral reckoning to ensure that Alzheimer's patients are never again abandoned in the shadows of ignorance and stigma.

Critical Milestones

1906
Alois Alzheimer presents the case of Auguste Deter at Tübingen, describing plaques and tangles. The audience shows little interest.
1910–1945
Two World Wars dismantle European neuroscience infrastructure. Alzheimer's research stalls. The Freudian paradigm takes hold.
1976
Robert Katzman publishes his landmark editorial reframing Alzheimer's as "a major killer."
1990
Dr. Sharon Inouye develops the Confusion Assessment Method (CAM), establishing delirium as a distinct clinical entity.
2004
Pittsburgh Compound B (PiB) enables the first in-vivo PET imaging of amyloid plaques in a living brain.
2011
The National Alzheimer's Project Act is signed, establishing a coordinated federal strategy.
2019–2023
The digital biomarker era begins. FDA issues guidance. Blood-based amyloid tests gain traction. Consumer wearables enter research protocols.
2024–2025
DiMe publishes V3+ validation framework. 86 AI models for AD digital biomarkers cataloged. Only 2 have external validation.
2026
Nature Reviews Drug Discovery reports wearables now integrated into interventional drug trials across neurology.

Part III · The Hippocampus, the Amygdala & Why the Brain Betrays Itself

To understand what wearable sensors are trying to detect, you need to understand what is breaking. The hippocampus — named for the Greek hippokampos, meaning "seahorse," after its curved shape resembling the mythical fishtailed horses pulling Poseidon's chariot — is the brain's primary engine for memory formation, spatial navigation, and learning. It is part of the limbic system, deeply involved in converting short-term memories into long-term ones. And it is, tragically, one of the first structures to atrophy in Alzheimer's disease.

Figure 1 · The Hippocampal-Amygdalar Memory System
[Schematic showing four labeled structures: the hippocampus (memory, navigation, learning — first affected in Alzheimer's), the amygdala (emotion, fear, emotional tagging — emotional memory anchor), the substantia nigra (dopamine loss in Parkinson's), and the prefrontal cortex.]
Simplified schematic of key brain structures affected in Alzheimer's and Parkinson's disease. In Alzheimer's, hippocampal atrophy is among the earliest structural changes. In Parkinson's, degeneration of dopaminergic neurons in the substantia nigra disrupts basal ganglia motor circuits.

The hippocampus operates in tight coordination with the amygdala, which assigns emotional weight to experiences. This is why a scent can transport you back to childhood, or why the face of a loved one can trigger a physiological response before conscious recognition kicks in. The amygdala's involvement explains why emotionally significant memories are often the last to fade in Alzheimer's patients — a fact with direct implications for therapeutic interventions, from music therapy to personalized voice reminders delivered through wearable devices.

Neuroanatomy Note — Five Pathways of Memory

The brain encodes memory through five principal mechanisms: Association (linking new information to existing knowledge), Novelty (prioritizing unexpected events), Repetition (strengthening neural pathways through rehearsal), Emotional tagging (the amygdala's reinforcement of affect-laden experiences — we remember what our amygdala has linked to pleasure or pain), and Contextual binding (anchoring memories to environmental cues). In Alzheimer's disease, the hippocampal circuits responsible for encoding and retrieval are progressively dismantled. Short-term memory fails first. Autobiographical memory — the narrative of who you are — falls last. But when it goes, it takes identity with it.

In Parkinson's disease, the primary target is different but the devastation is comparable. Dopaminergic neurons in the substantia nigra degenerate, disrupting the basal ganglia circuits that govern voluntary movement. Depression, anxiety, REM sleep behavior disorder, constipation, loss of smell, and cognitive impairment are all part of the clinical picture, and many precede the tremor by years — sometimes a decade or more. This is precisely the window that digital biomarkers aim to exploit: the prodromal phase, where the disease is active but not yet visible.

Part IV · Two Diseases, One Design Challenge

While both conditions share the label "neurodegenerative," they attack different systems, present different clinical challenges, and demand different approaches from wearable technology.

Table 1 · Comparative Clinical Profiles
Dimension | Alzheimer's Disease | Parkinson's Disease
Primary pathology | Amyloid-beta plaques, tau tangles, hippocampal atrophy | Loss of dopaminergic neurons in substantia nigra; Lewy body accumulation
Cardinal symptoms | Memory loss, disorientation, language deterioration, personality changes | Tremor, rigidity, bradykinesia, postural instability, freezing
Prodromal signals | Gait slowing, speech changes, sleep fragmentation, executive dysfunction | REM sleep behavior disorder, constipation, anosmia, depression, reduced arm swing
Wearable targets | Gait speed & variability, circadian disruption, speech cadence, GPS wandering | Tremor frequency & amplitude, freezing detection, gait asymmetry, falls, dyskinesia
Key design constraint | Cognitive impairment: may not understand device, may remove it | Motor impairment: may be unable to clasp, charge, or interact with device
Caregiver role | Device manager; behavioral alert interpreter; safety monitor | Medication timing support; fall response; mobility advocate
Many patients present with overlapping features (PD dementia, Lewy body dementia). The boundaries between conditions are increasingly recognized as porous.

The Progression Arc

Stage 1
Preclinical
No symptoms. Pathology accumulating silently. Wearables detect subtle gait, sleep, or speech shifts years before diagnosis.
Stage 2
Prodromal / MCI
Mild symptoms noticed by family. Wearables provide longitudinal data for clinical confirmation and trial enrollment.
Stage 3
Early Clinical
Diagnosis confirmed. Wearables shift to safety (fall detection, wandering) and treatment monitoring.
Stage 4
Moderate–Severe
Significant disability. Wearables support caregiver coordination, delirium detection, and quality-of-life interventions.

Part V · Digital Biomarkers: Reading the Body's Hidden Signals

The FDA defines a digital biomarker as "a characteristic or set of characteristics collected through digital health technologies, which serve as indicators of normal biological processes, pathogenic processes, or responses to exposure or interventions." In practice, this means data from accelerometers, gyroscopes, photoplethysmography sensors, microphones, and touchscreens — hardware already in devices millions carry daily.

The 2025 Gramkow et al. scoping review found that the most studied categories were rest and activity patterns (39%), speech (17%), and gait (14%). Only 16% reported diagnostic outcomes. Only 3% addressed prognosis. The field is generating tremendous signal but has not yet converged on the clinical endpoints that matter most.

0.887
Average AUC for AI models detecting Alzheimer's via digital biomarkers
npj Digital Medicine, 2025
12.1 yr
Inflection point in gait speed decline before MCI diagnosis
Buracchio et al.
89%
Of digital health technologies in AD research are commercially available
DiMe landscape review

Gait: The Sixth Vital Sign

Gait speed may be as clinically informative as blood pressure. Buracchio and colleagues found that an inflection point occurs roughly 12 years before MCI diagnosis — the annual decline rate shifts from –0.005 m/s/year to –0.023 m/s/year, a fivefold acceleration invisible to casual observation but unmistakable to a wrist or ankle sensor. IMU sensors can estimate step count, stride length, and variability with clinically useful accuracy. A 2025 University of Maryland study used a single wearable sensor during a Timed Up and Go test to distinguish PD from other parkinsonisms.
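As a toy illustration of how a monitoring pipeline might surface this kind of inflection, the sketch below scans a yearly gait-speed series for a sustained acceleration in decline. The threshold and window are hypothetical values chosen between the two published rates; this is not Buracchio et al.'s actual method.

```python
def yearly_slopes(speeds):
    """Year-over-year change in gait speed (m/s per year)."""
    return [b - a for a, b in zip(speeds, speeds[1:])]

def detect_inflection(speeds, threshold=-0.015, window=3):
    """Return the first year index at which the mean decline over
    `window` consecutive years crosses `threshold` (m/s/year),
    or None. Both parameters are illustrative, not validated."""
    slopes = yearly_slopes(speeds)
    for i in range(len(slopes) - window + 1):
        if sum(slopes[i:i + window]) / window <= threshold:
            return i + 1
    return None

# Ten years of slow decline (-0.005 m/s/yr), then accelerated
# decline (-0.023 m/s/yr), mirroring the reported rates.
speeds = [1.20 - 0.005 * t for t in range(10)]
speeds += [speeds[-1] - 0.023 * t for t in range(1, 6)]
```

The point of the sketch is the shape of the computation — change from the patient's own trajectory, not an absolute cutoff — which is why a flat series produces no flag at all.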

Technical Detail — Sensor Accuracy

IMU-based step count accuracy ranges from –6.7% to +6.2%. Stride length estimation achieves median errors below 5%. Contact pressure sensors in instrumented socks capture stance/swing ratios with >95% correlation. Fusing geopositioning data with IMU signals further improves composite metrics. These levels suffice for longitudinal monitoring where the signal is change from baseline rather than absolute measurement.
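For a sense of what the first stage of such an IMU pipeline looks like, here is a minimal peak-based step counter over accelerometer magnitudes. The 1.2 g threshold and 0.3 s refractory period are arbitrary illustrative values, not parameters from the cited studies.

```python
import math

def step_count(samples, fs=50.0, threshold=1.2, refractory=0.3):
    """Count steps from triaxial accelerometer samples (in g).
    A step is a magnitude peak above `threshold` g with at least
    `refractory` seconds between successive steps. All parameters
    are illustrative, not tuned to any specific device."""
    min_gap = int(refractory * fs)
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    steps, last = 0, -min_gap
    for i in range(1, len(mags) - 1):
        is_peak = (mags[i] > threshold
                   and mags[i] >= mags[i - 1]
                   and mags[i] > mags[i + 1])
        if is_peak and i - last >= min_gap:
            steps += 1
            last = i
    return steps
```

Real devices layer filtering, gait-phase segmentation, and sensor fusion on top of this, but the core idea — thresholded peaks with a refractory period — is the same.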

Speech: The Sound of Cognitive Decline

Pauses, hesitations, word-finding failures, reduced syntactic complexity, and changes in prosody are all markers AI can detect. Boston University researchers achieved 96% accuracy identifying AD from phone calls. IBM predicted onset seven years before diagnosis using subtle linguistic shifts. For Parkinson's, speech analysis captures hypophonia, monotone delivery, and articulatory imprecision. Because speech data can be captured passively via smartphone microphone, it offers a uniquely scalable screening pathway.
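A minimal sketch of the kind of feature extraction involved, assuming an upstream voice-activity detector has already produced speech segments; the 0.25 s pause floor is a made-up parameter, not a clinical standard.

```python
def pause_features(segments, min_pause=0.25):
    """Compute simple pause statistics from voice-activity output.
    `segments` is a list of (start, end) times in seconds for
    detected speech. Gaps shorter than `min_pause` are ignored."""
    pauses = []
    for (s1, e1), (s2, e2) in zip(segments, segments[1:]):
        gap = s2 - e1
        if gap >= min_pause:
            pauses.append(gap)
    total = segments[-1][1] - segments[0][0] if segments else 0.0
    return {
        "pause_count": len(pauses),
        "mean_pause_s": sum(pauses) / len(pauses) if pauses else 0.0,
        "pause_ratio": sum(pauses) / total if total else 0.0,
    }
```

Features like these would feed a downstream classifier alongside lexical and prosodic measures; none of them is diagnostic on its own.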

Sleep: The Night Shift

Alzheimer's patients exhibit disrupted circadian rhythms — fragmented sleep, reduced slow-wave sleep, increased nighttime wakefulness. Mounting evidence suggests these patterns contribute to disease progression by impairing glymphatic clearance of amyloid-beta during sleep. The relationship appears bidirectional: amyloid disrupts sleep, disrupted sleep accelerates amyloid. Wearable actigraphy captures these patterns continuously, generating longitudinal data no clinic visit can replicate.
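A simplified sketch of the kind of summary an actigraphy pipeline produces, assuming an upstream classifier has already labeled one-minute epochs as sleep or wake; the convention of taking sleep onset at the first sleep epoch is deliberately naive.

```python
def sleep_fragmentation(epochs):
    """Summarize fragmentation from 1-minute actigraphy epochs.
    `epochs` is a list of 'S' (sleep) / 'W' (wake) labels covering
    the main rest period. Returns wake-after-sleep-onset minutes
    and the number of distinct awakenings."""
    try:
        onset = epochs.index("S")
    except ValueError:
        return {"waso_min": 0, "awakenings": 0}
    night = epochs[onset:]
    waso = night.count("W")  # total wake minutes after onset
    awakenings = sum(1 for prev, cur in zip(night, night[1:])
                     if prev == "S" and cur == "W")
    return {"waso_min": waso, "awakenings": awakenings}
```

Tracked nightly, metrics like these build the longitudinal record — fragmentation trending up, awakenings clustering — that a single clinic visit cannot capture.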

Eye Tracking & Fine Motor

MIT/Harvard AI models analyzing eye movement achieve ~93% accuracy identifying AD. Finger tapping speed and handwriting analysis (pressure, speed, stroke direction via touchscreen) add further dimensions to composite digital phenotyping. These can be administered via smartphone, making them accessible beyond specialist centers.

🚶

Gait Analysis

Speed, stride length, variability, asymmetry. Detectable 12+ years before MCI. IMU sensors on wrist, hip, ankle, or in shoes.

🗣️

Speech Patterns

Pauses, word-finding, prosody, complexity, volume. 96% accuracy from phone calls (BU). Passive, scalable, no clinic visit.

😴

Sleep & Circadian

Fragmentation, slow-wave loss, nighttime waking. Bidirectional with amyloid clearance. 39% of all AD digital biomarker studies.

👁️

Eye Tracking

Pupil dilation, saccade patterns, focus maintenance. 93% accuracy (MIT/Harvard). Potential neuroimaging replacement for screening.

Fine Motor

Finger tapping, handwriting pressure & stroke. Smartphone-administrable. Adds to composite risk scoring.

❤️

Autonomic Signals

Heart rate variability, skin conductance, temperature. Markers for delirium, distress, autonomic dysfunction in PD.

Part VI · AI in the Diagnostic Pipeline

Deep learning models now analyze PET scans to detect amyloid-beta and tau patterns with precision outperforming many human readers. A UCSF team identified Alzheimer's pathology six years before clinical diagnosis. Stanford's MRI model achieved >94% accuracy distinguishing AD from healthy controls. But the paradigm shift may be in blood: C2N's PrecivityAD test, plasma-based p-tau 217, NfL, and GFAP assays combined with AI pattern recognition could democratize screening beyond centers with PET scanners.

94%
Stanford AI model accuracy distinguishing AD from healthy controls on MRI
6 yr
Advance detection by UCSF AI model analyzing PET scans before diagnosis
85–90%
Accuracy of AI-driven blood biomarker tests for early-stage Alzheimer's
Emerging Evidence — March 2026

A Nature Reviews Drug Discovery article published in March 2026 found that wearables are now integrated into interventional drug trials, used not just for monitoring but as primary and secondary endpoints. Both FDA and EMA are issuing frameworks for validation, data integrity, and patient safety in digital health technology integration. The LUMA trial (BIIB122, LRRK2 inhibitor), ACTIVATE trial (BIA 28-6156, GCase targeting), and AHEAD 3-45 (lecanemab in preclinical AD) all represent plausible disease-modifying approaches currently in progress.

Part VII · Designing for Dignity

The most sophisticated sensor is useless if the patient will not wear it. Alzheimer's patients may perceive an unfamiliar wristband as foreign and remove it. Parkinson's patients may find clasps impossible. Patients with sensory changes may be irritated by textures no one else notices. The design challenge is empathic, not primarily technical.

The best wearable for a patient with Alzheimer's is the one they forget they are wearing. The worst is the one that reminds them, every minute, that something is wrong.
— Drawn from the OHSU ISAAC living laboratory studies
Design Principle 1 — Comfort & Sensory Integration

Materials must be hypoallergenic, lightweight, breathable. Medical-grade silicone or soft fabric blends. No seams, edges, or rigid housings. Temperature-adaptive materials or gentle compression to soothe rather than irritate. The device should feel like clothing, not hardware.

Design Principle 2 — Familiarity Over Novelty

Disguise devices as everyday objects: a classic wristwatch, a pendant, a brooch. Anything that reads as "medical device" triggers resistance. Neutral colors, soft textures, analog aesthetics. A smartwatch that looks like a traditional timepiece stays on the wrist.

Design Principle 3 — Passive, Hands-Free Operation

No buttons, no screens, no batteries to swap. Kinetic or solar charging, auto-sync via Bluetooth, OTA firmware updates. If a caregiver must intervene daily, adoption will fail.

Design Principle 4 — Disease-Specific Adaptation

Alzheimer's: GPS geofencing, haptic cues for agitation, personalized audio (familiar voice, beloved music) triggered by distress signals. Parkinson's: Rhythmic cueing for freezing episodes, fall detection with auto-alerts, adaptive stabilization for tremor.

Design Principle 5 — Caregiver-Centric Dashboards

AI-powered triage filters routine data from actionable alerts. Predictive warnings — likely agitation, wandering, or falls — are far more valuable than after-the-fact notifications. Radical simplicity: what to show, what to suppress, when to escalate.
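The filtering idea can be sketched as a rule table. The alert kinds, scores, and thresholds below are hypothetical placeholders for whatever a real triage model would emit; the point is the three-way split between escalation, notification, and silent logging.

```python
def triage(alert):
    """Route a raw device alert to a caregiver-facing tier.
    `alert` is a dict like {"kind": "fall", "score": 0.97}. Kinds
    and thresholds are illustrative, not a validated protocol."""
    kind, score = alert["kind"], alert["score"]
    if kind in ("fall", "wandering"):
        return "escalate"   # immediate caregiver page
    if kind == "predicted_agitation" and score >= 0.8:
        return "notify"     # push notification, non-urgent
    return "log"            # routine data: suppress, keep for trends
```

The design choice worth noting is the default: anything not explicitly actionable is logged silently, because a dashboard that forwards everything trains caregivers to ignore it.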

Design Principle 6 — Privacy, Security & Autonomy

Encryption and strict access controls are non-negotiable. HIPAA is a floor, not a ceiling. A device that feels like surveillance erodes dignity. A device that feels like support preserves it. The line is thin.

Design Principle 7 — Co-Design with Stakeholders

Patients, caregivers, neurologists, geriatricians, nurses, PTs, and OTs at the design table — not as advisory board reviewers, but as co-creators from the earliest concept stage.

The Mandala Model — Dr. Jeff Kaye, OHSU

A radial visualization of daily activity patterns: walking speed, room transitions, medication cabinet openings, phone usage, sleep. Clinicians read it at a glance. Deviations from the patient's baseline serve as early warning flags. This paradigm — continuous, passive, longitudinal, baseline-relative — represents the future of wearable-based neurodegenerative care.
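The baseline-relative idea reduces to a small computation: compare each day's metric to the patient's own recent history rather than to a population norm. A minimal sketch, with an illustrative 14-day window and z-score threshold (not parameters from the OHSU work):

```python
from statistics import mean, stdev

def baseline_flags(daily_values, window=14, z_thresh=2.0):
    """Flag days whose metric (e.g. mean walking speed) deviates
    from the patient's own rolling baseline. Window length and
    z-threshold are illustrative choices."""
    flags = []
    for i in range(window, len(daily_values)):
        base = daily_values[i - window:i]
        mu, sd = mean(base), stdev(base)
        if sd > 0 and abs(daily_values[i] - mu) / sd > z_thresh:
            flags.append(i)
    return flags
```

Because the baseline is the patient's own, the same rule works for a fast walker and a slow one — what matters is the departure from their norm.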

Part VIII · Delirium & Falls: Where Wearables Save Lives Today

Delirium is under-detected in ~60% of cases. It accelerates cognitive decline and increases dementia risk. Inouye's multicomponent strategies reduce incidence by up to 40%, but require continuous monitoring. Wearables fill this gap: HR variability and temperature sensors detect physiological signatures hours before behavioral symptoms. Sleep monitoring flags circadian disruption. Movement sensors distinguish purposeful activity from delirious restlessness. Smart home integration triggers environmental adjustments — dimming lights, reducing noise, prompting hydration.

Table 2 · Wearable Capabilities in Delirium & Fall Management
Clinical Need | Wearable Capability | Actionable Outcome
Delirium early warning | HRV, skin temp, SpO₂, sleep fragmentation | Alert hours before behavioral symptoms
Fall detection | Accelerometer + gyroscope fusion, auto-emergency call | Rapid response with GPS location
Fall prediction | Longitudinal gait speed decline, stride variability | Proactive PT referral or assistive device
Wandering prevention | GPS + geofencing + push notification | Real-time alert when patient exits safe zone
Medication risk | Med timing logs × dizziness/movement data | Identify meds contributing to fall/confusion
Environmental | Smart home sensors (light, floor, temp, noise) | Automated adjustments reducing triggers & hazards
Based on commercially available devices and RADAR-AD, IDEA-FAST, and OHSU ISAAC consortia research.
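The geofencing row in the table is the most mechanically simple of these capabilities: a great-circle distance check against a safe-zone radius. A sketch, with the 200 m radius as an arbitrary example value:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def outside_geofence(point, home, radius_m=200.0):
    """True when `point` (lat, lon) lies outside a circular safe
    zone centred on `home`. The 200 m default is illustrative."""
    return haversine_m(*point, *home) > radius_m
```

In practice the hard problems are upstream (GPS drift indoors, battery budget for continuous fixes) and downstream (who gets the alert, and how fast), not the geometry itself.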

Part IX · A Night in the Life

Illustrative Clinical Scenario

Margaret, 78 — Alzheimer's Diagnosis, History of Nocturnal Delirium

It is 2:14 a.m. Margaret's wearable — a soft silicone band that looks like the bracelet her granddaughter gave her — detects a 22% spike in heart rate, a 1.3°F skin temperature rise, and irregular movement sharply deviating from her nighttime baseline.

The AI model, trained on three months of Margaret's data, classifies the pattern as consistent with early delirium (confidence: 0.84) and initiates a tiered response:

Tier 1 (Automated): Smart home dims lights to warm 2700K. A recording of her daughter's voice reading a beloved passage plays at low volume. Room temperature drops 2°F.

Tier 2 (Caregiver alert): Sarah, two blocks away, gets a push notification: "Margaret's nighttime pattern suggests possible delirium onset. Calming environment activated. Check-in recommended within 30 minutes."

Tier 3 (Clinical escalation): If vitals don't stabilize in 45 minutes, or if movement sensors detect wandering, the system alerts Margaret's neurologist with a 4-hour data summary.

At 2:38 a.m., Margaret's heart rate settles. The environmental interventions appear to have worked. Sarah calls through the room speaker. Margaret responds, groggy but oriented. No fall. No ER visit. No fracture.

The next morning, the neurologist reviews a trend chart showing increased nocturnal episodes over two weeks. She adjusts evening medication timing and orders a urinalysis — exactly the kind of workup that was missed in Edith Harrison's case sixteen years earlier.

Clinical Note

This scenario is illustrative — no single system executes the full sequence today. But every component (vital sign monitoring, movement analysis, smart home integration, tiered alerting, trend visualization) exists in current prototypes or commercial devices. The integration is the engineering challenge. The clinical logic already exists.
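The escalation logic in the scenario is itself simple to state in code; the integration across devices is the hard part. A sketch of the tiered decision, with every threshold and timing invented for illustration:

```python
def tiered_response(vitals, minutes_since_onset, wandering=False):
    """Sketch of a tiered escalation decision. `vitals` holds
    z-scores of heart rate, skin temperature, and movement
    relative to the patient's nighttime baseline. All thresholds
    and timings are illustrative, not clinical guidance."""
    actions = []
    elevated = max(vitals.values()) > 2.0
    if elevated:
        actions.append("tier1_calming_environment")   # automated
        actions.append("tier2_caregiver_notification")
    if wandering or (elevated and minutes_since_onset >= 45):
        actions.append("tier3_clinician_alert")       # escalate
    return actions
```

Note that wandering bypasses the timer entirely: some signals justify clinical escalation regardless of how recently the episode began.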

Part X · The Gaps We Cannot Afford to Ignore

Validation Crisis

Of the 86 AI models reviewed, only two had undergone external validation, and only three reported model calibration. Without multi-site, prospective validation, impressive AUCs cannot translate into reliable clinical tools. Altoida remains the only AD-specific device with both FDA clearance and CE marking.

Data Privacy & Equity Gap

Who owns continuous health data? How is it governed under HIPAA, GDPR, and emerging regulations? Training datasets remain disproportionately white, Western, and well-resourced. Until diversity is achieved, generalizability remains in question.

The "So What" Problem

For a patient diagnosed with preclinical neurodegeneration today, actionable options remain limited. Early detection without early treatment risks creating people who know they're sick but can't yet be helped — a psychological burden the field hasn't addressed.

Caregiver Burden

Every data stream creates monitoring obligations. Without clinical triage protocols, alerts become noise.

Clinical Workflow

A neurologist with seven minutes per visit needs decision support tools that synthesize wearable data into actionable recommendations, not raw feeds. Without them, wearable data becomes sophisticated noise in an overburdened system.

Part XI · Where This Is Headed

🔮

Ultra-Early Screening

Composite panels — gait, speech, sleep, blood biomarkers — enabling screening decades before symptoms, through devices people already own.

🎯

Precision Therapeutics

AI synthesizing wearable data, genomic risk, and treatment response for individualized care plans that evolve with the patient.

💗

Emotionally Adaptive AI

Devices detecting distress via voice, HR, and movement, responding with personalized interventions — a familiar voice, a song, a vibration.

🏠

Ambient Intelligence

Smart homes with sensors in floors, furniture, walls. The wearable disappears. Monitoring without wearing or charging anything.

🧬

Multimodal Fusion

Neuroimaging + genetics + blood biomarkers + digital phenotyping into unified models capturing full disease complexity.

🧪

Wearables as Endpoints

Continuous real-world data replacing episodic clinic assessments as primary trial endpoints — already underway per Nature Reviews Drug Discovery.

On the Horizon — Advanced Materials

Self-healing, stretchable, breathable electronic materials advancing toward wearables indistinguishable from skin. E-tattoos and epidermal electronics — thin sensor patches applied directly to the body — combined with energy-harvesting (thermoelectric generators, sweat-powered biofuel cells) could operate indefinitely. The patient would not just forget they're wearing a sensor. They would be unable to tell.

Part XII · Designing for Humanity

The wearable revolution will not be won by the most accurate algorithm or sensitive accelerometer. It will be won by teams — engineers, clinicians, designers, caregivers, and patients — who understand that technology must serve the person, not the other way around.

Karlawish's account of Edith Harrison is a reminder of what happens when the system doesn't see the patient. Inouye's work on delirium is a reminder that the most dangerous conditions are often the most invisible. Kaye's Mandala Model reminds us that the richest clinical insight sometimes comes not from a single test but from the quiet accumulation of daily patterns observed over time.

Wearable sensors are ultimately instruments of attention: they watch when no one else is watching, they listen when no one else is listening, and they remember when the patient cannot. Whether that attention translates into better care depends on the choices we make now.

Those choices include how we validate with the rigor these tools deserve, how we regulate without stifling innovation, how we ensure they work for all patients, how we design for the people who need them most, and whether we prioritize dignity over data.

Over a century after Alois Alzheimer described the plaques and tangles that bear his name, the disease lacks a cure. Parkinson's, first characterized in 1817, still lacks a therapy that halts its progression. But for the first time, we have tools to see them coming — not in a scan three months away, but in the rhythm of a footstep, the cadence of a voice, the restlessness of a night's sleep.

What we do with that sight is the question that will define the next chapter. The answer will be written not just in code and silicon, but in the quality of attention we choose to pay to the people who need it most.

Sources & Further Reading

Karlawish, J. The Problem of Alzheimer's. St. Martin's Press, 2021. Penn Memory Center ↗
Cejudo et al. "AI and Wearables for Early Detection of Cognitive Impairment." JMIR 2026;28:e86262. JMIR ↗
Li et al. "AD digital biomarkers landscape and AI model scoping review." npj Digital Medicine, 2025. Nature ↗
"Wearable technologies in clinical trials." Nature Reviews Drug Discovery, Mar 2026. Nature ↗
Gramkow et al. "Digital biomarkers in early AD from wearable technology." medRxiv, May 2025. medRxiv ↗
Kourtis et al. "Digital biomarkers for AD: the mobile/wearable opportunity." npj Digital Medicine 2019;2:9. Nature ↗
Von Coelln et al. "Wearable Sensors & ML for PD vs. Parkinsonism." Sensors 2025. PMC ↗
Lott et al. "Digital Health Technologies for ADRD: Landscape Analysis." J Prev Alzheimers Dis, 2024. Springer ↗
"Digital biomarkers: Redefining clinical outcomes." Alzheimer's & Dementia, 2025. PMC ↗
Inouye, S.K. et al. "Delirium in elderly people." The Lancet, 2014. PMC ↗
Khan et al. "Wearable Solutions for PD and Neurocognitive Disorder." Sensors 2020;20(9):2713. MDPI ↗
Sullivan, S. Designing for Wearables. O'Reilly Media. O'Reilly ↗
LeMoyne et al. Wearable & Wireless Systems for Healthcare II. Barnes & Noble ↗
"Neurology Trends in 2026: AI & Wearables." FL Center for Neurology, Feb 2026. FCN ↗