Together, macroeconomic conditions and emerging technology are reshaping a profession that depends on time, trust, and reimbursement. Clinics report higher borrowing costs, more price-sensitive clients, and digital substitutes that are easy to access, while insurance reimbursement rates remain flat, according to coverage of the Heard survey.
These conditions require practices to protect the clinical relationship while updating their business models.
Some of these pressures are structural rather than temporary. Interest rates have remained elevated in recent years, raising borrowing costs for practices. Federal regulators have finalized long-term Medicare telehealth policies for behavioral and mental health, making home-based delivery a permanent option beginning in 2026 and changing the mix of services.
At the same time, consumer chatbots promise continuous support but raise safety and privacy risks that may change how people distinguish between informal help and licensed care.
Therapy Practices Navigate Higher Costs and AI Regulation
- Inflation and interest rates have tightened margins for independent clinicians.
- Digital-health funding fell by roughly two-thirds from 2021 to 2024, curbing easy capital.
- Medicare made home-based behavioral telehealth permanent starting in 2026.
- FTC launched a 2025 study of seven AI companion chatbot companies.
- APA ethics require AI to augment, not replace, professional judgment.
- Clinics respond with diversified services and administrative AI under safeguards.
Economic Strain on Independent Clinics
Financial stress shows up first in operating budgets. Solo and small-group clinicians said inflation drove up rent, software subscriptions, and malpractice insurance premiums, according to a 2024 survey covered by Behavioral Health Business.
Most respondents chose not to raise self-pay fees, worried that clients would shorten treatment episodes or drop out. Flat cash rates and insurance reimbursements left little room to absorb higher overhead.
Capital has also become scarcer. Venture investment in U.S. digital-health companies fell from about $29 billion in 2021 to roughly $10 billion in 2024, according to Geodesic Capital. The funding pullback also reaches software vendors that once used venture capital to subsidize practice-management tools.
Higher interest rates have also made the credit lines many clinicians use to finance office expansions or electronic health-record upgrades more expensive. For owners who rely on borrowing to smooth cash flow or invest in technology, rate increases translate directly into tighter budgets.
Taken together, inflation, tight credit, and muted investment have pushed some owners to pause hiring or shift to lower-cost telehealth footprints. The financial logic now favors lean staffing and flexible leases instead of traditional multi-room office suites.
Telehealth Enters a Permanent Policy Era
By 2024, telehealth had become a routine part of care. Twenty-five percent of Medicare fee-for-service beneficiaries used at least one virtual visit in 2024, a share unchanged since 2023, according to the U.S. Department of Health and Human Services Telehealth Research Trends portal.
A 2025 FAQ from the Centers for Medicare & Medicaid Services confirmed that, starting in 2026, practitioners may deliver psychotherapy to any Medicare patient at home. This includes sessions conducted via two-way audio-only devices under certain conditions, with no geographic restrictions.
Medicare requires safeguards and specific conditions for coverage. According to CMS, the policies are structured to increase flexibility while maintaining oversight, and the agency details documentation and visit requirements that differ from pre-pandemic location rules.
Commentary on these changes suggests that home-based telehealth may help stabilize scheduling and lower missed-appointment risk for clients who face barriers such as long travel distances, childcare responsibilities, or limited internet bandwidth.
Some practices are exploring "telehealth-first" panels in which clinicians design schedules around virtual care as the default and reserve in-person visits for specific clinical needs.
Permanent audio-only coverage also supports phone-based check-ins for patients who experience video fatigue or have disabilities that make video sessions difficult. This option can widen access without significant new infrastructure, effectively allowing many private spaces to function as billable treatment settings when clinical and privacy standards are met.
AI Companion Chatbots Face Scrutiny
Parallel to formal therapy, consumer chatbots that present themselves as companions or coaches have proliferated. In 2025 the Federal Trade Commission opened a study by issuing compulsory 6(b) orders to seven companies: Alphabet, Character AI, Instagram, Meta, OpenAI, Snap, and xAI.
The agency's press release cited potential child harms and opaque data practices as reasons for the inquiry.
FTC chair Andrew N. Ferguson said the study would help the agency understand how AI firms are developing their products and what steps they are taking to protect children. The investigation relies on statutory powers that allow the FTC to demand internal records even when it has not alleged a formal violation.
The FTC described these tools as chatbots that act as companions, friends, or advisers. They operate outside traditional health-care relationships but can still receive sensitive disclosures about mood, relationships, or self-harm, which raises questions about how companies store and use that information.
Because many consumer chatbots do not qualify as covered entities or business associates under health privacy laws such as HIPAA, their handling of user data can follow different rules than clinical records. This raises concerns that information shared in seemingly private conversations could be reused for model training, personalization, or other purposes that would face stricter limits in health-care settings.
For brick-and-mortar therapy practices, the commercial rise of companion bots presents both potential competition and reputational risk. If some users begin to equate automated advice with professional counseling, negative experiences or safety failures in consumer products could influence public trust in evidence-based mental health care.
Ethical Guardrails for Clinical AI
The American Psychological Association moved to draw clearer boundaries in 2025, publishing "Ethical Guidance for AI in the Professional Practice of Health Service Psychology." The document, summarized by APA Services, states that clinicians remain responsible for all decisions and must treat algorithmic output as one input among many.
According to the guidance from APA, key expectations include HIPAA compliance, strong cybersecurity, and transparent patient consent when tools process protected health information. The document discourages blind reliance on "black-box" systems and calls for human monitoring of model errors or hallucinations.
“Historically, technology is something that psychologists have been rightly wary of. But AI is here. As a field, we need to join in, or we’re going to get left behind.”
Clinical use cases highlighted in professional discussions often include documentation assistance, outcome-measure scoring, and triage chat that routes inquiries to licensed staff. These deployments differ from direct-to-consumer chatbots because they sit inside the therapeutic workflow, with access controls and professional oversight.

Even so, the legal environment remains unsettled. Many state licensing boards and malpractice insurers are still developing or clarifying their positions on AI-generated documentation, decision-support tools, and related risk coverage, according to professional commentary.
In practice, adherence to APA guidance functions as a basic risk-management standard for clinics that choose to integrate AI.
Practice-Level Adaptations
Faced with narrow margins, many clinicians are diversifying services. Survey data summarized by Behavioral Health Business show that a significant share of therapists supplement income through supervision, teaching, or consulting to buffer against appointment volatility.
Group programs and brief, protocol-driven workshops can convert therapist hours into formats that reach more people at once. Employers have become a growing referral channel as human-resources teams seek mental-health benefits that extend beyond crisis hotlines and one-to-one sessions.
Telehealth enables these models to reach clients in a wider geographic area within a clinician's licensed jurisdictions without new office leases. Practices can experiment with combinations of in-person intakes, ongoing virtual sessions, and periodic on-site visits tailored to clinical needs and patient preferences.
Administrative AI now underpins many hybrid models. Tools that transcribe sessions, draft progress notes, or flag missed outcome measures can reduce administrative time, allowing practitioners to reallocate more hours to billable care, provided practices document oversight, vendor due diligence, and patient consent where tools touch clinical records.
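As a purely illustrative sketch rather than any vendor's actual product, the "flag missed outcome measures" function described above can be as simple as comparing each client's most recent score date against a review window. The client IDs, the 30-day threshold, and the placeholder dates below are hypothetical.

```python
from datetime import date, timedelta

# Hypothetical example: flag clients whose most recent outcome measure
# (for instance, a PHQ-9 score) is older than a 30-day review window.
REVIEW_WINDOW = timedelta(days=30)

# Last recorded measure date per client (placeholder data).
last_measure = {
    "client_001": date(2025, 9, 2),
    "client_002": date(2025, 11, 20),
    "client_003": None,  # never administered
}

def overdue_clients(records, today=None, window=REVIEW_WINDOW):
    """Return client IDs with no measure recorded inside the window."""
    today = today or date.today()
    return [
        client_id
        for client_id, last in records.items()
        if last is None or today - last > window
    ]

if __name__ == "__main__":
    # With a reference date of Dec. 1, 2025, clients 001 and 003 are flagged.
    print(overdue_clients(last_measure, today=date(2025, 12, 1)))
```

A real deployment would pull these dates from the practice's record system and log who reviewed each flag, consistent with the oversight and consent expectations noted above.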
“It’s getting more expensive to be a therapist due to inflation, yet rates aren’t going up, so therapists are struggling.”
For clinics, this cost pressure means that efficiency gains from telehealth and AI matter only if they translate into sustainable workloads or differentiated services. Practices that adopt new tools without clear plans for scheduling, caseload management, and quality measurement may still find margins under strain.

Forward Outlook
Digital-health investors continued to deploy capital through the third quarter of 2025, although at lower levels than during the pandemic peak, according to Rock Health. Analysts there interpret current volumes as a possible new funding floor rather than a return to 2021 exuberance.
As of late 2025, it remains unclear how the FTC companion-chatbot study will affect commercial marketing claims or product design. State licensing boards and malpractice carriers are still outlining their approaches to AI in clinical workflows, and professional organizations continue to update guidance.
Across these developments, the central difference remains accountable human care. Economic cycles may shift and algorithms may improve, but licensed professionals continue to hold legal and ethical responsibility for clinical decisions.
Practices that combine that accountability with carefully selected technology are better positioned to maintain stable care during future periods of volatility.
Sources
- Behavioral Health Business. "Private Therapy Practices Face Financial Strain Amid Economic Woes." 2024.
- Geodesic Capital. "Scaling Digital Health in 2025: Lessons from JPM Healthcare Conference." 2025.
- U.S. Department of Health and Human Services. "Telehealth Research Trends." 2025.
- Centers for Medicare & Medicaid Services. "Telehealth FAQ Calendar Year 2026." 2025.
- Federal Trade Commission. "FTC Launches Inquiry into AI Chatbots Acting as Companions." 2025.
- APA Services. "Artificial Intelligence Is Reshaping How Psychologists Work." American Psychological Association, 2025.
- American Psychological Association. "Ethical Guidance for AI in the Professional Practice of Health Service Psychology." 2025.
- Federal Reserve Board. "Monetary Policy Report." 2026.
- Rock Health. "Q3 2025 Market Overview: Signals Out of Sync." 2025.
