Published Online: https://doi.org/10.1176/appi.ps.202100676

Abstract

Objective:

State mental health authorities (SMHAs) in all U.S. states and territories administer the Mental Health Block Grant (MHBG) set-aside funding for first-episode psychosis. Funds support implementation of coordinated specialty care (CSC) programs. The authors investigated the relationship between the level of SMHA involvement with CSC programs and clinical outcomes of clients in these programs.

Methods:

As part of a mixed-methods study of 34 CSC programs, SMHAs from 21 states and one U.S. territory associated with the 34 CSC programs participated in a 1-hour interview (between November 2018 and May 2019) focused on SMHA involvement in administration of MHBG set-aside funds and the SMHA’s ongoing relationship with funded CSC programs. SMHA involvement was rated on a scale of 1 to 5, with 5 indicating the highest involvement. Client outcome data were collected at the 34 study sites over an 18-month period. Multilevel random-effect modeling was used, controlling for response propensity (propensity score), client demographic variables, and program-level covariates (i.e., fidelity score, staff turnover rates, service area urbanicity, and number of clients enrolled).

Results:

Clients in CSC programs with SMHAs that were the most involved (level 5) had significantly improved symptoms, social functioning, and role functioning, compared with clients in programs with which SMHAs were least involved (level 1).

Conclusions:

The findings suggest that increased SMHA involvement in CSC programs is relevant for positive client outcomes. Levels of first-episode psychosis funding doubled in 2021 and 2022, and it is important to identify how SMHAs affect the success of CSC programs and the individuals served.

HIGHLIGHTS

  • Interviews with state mental health authorities (SMHAs) in 21 states and one U.S. territory showed differences in levels of SMHA involvement with coordinated specialty care (CSC) clinics receiving Mental Health Block Grant set-aside funds for first-episode psychosis.

  • SMHA involvement was characterized on a scale of 1 to 5 on the basis of frequency of communication with CSC program staff members and involvement with decision making regarding CSC model, staffing and training, and client enrollment criteria.

  • Higher levels of SMHA involvement at CSC clinics were associated with greater client improvement, such as reduction in symptoms and increased social and role functioning.

Coordinated specialty care (CSC) is a recovery-oriented, team-based treatment model that provides evidence-based services for people with first-episode psychosis (1–7). CSC seeks to intervene early and improve symptoms, social functioning, and quality of life (4, 5). In the United States, most CSC programs receive some funding through the Community Mental Health Services Block Grant (MHBG) set-aside program for first-episode psychosis (1, 8). The state mental health authority (SMHA) in each state or territory is responsible for administering the MHBG set-aside funds to CSC programs. SMHAs are state governmental agencies designed to ensure the delivery of services to individuals with mental health conditions (9). Typically, SMHAs coordinate and directly operate mental health services (e.g., direct psychiatric treatment and supports for housing, employment, and education) and allocate funds to community providers for mental health services not provided by the SMHA. SMHAs vary widely in how they are organized, staffed, and managed within state governments (8–10).

Since 2014, SMHAs have administered the MHBG set-aside funds for first-episode psychosis (8, 11). In fiscal years 2014 and 2015, Congress directed each state to set aside 5% of its MHBG dollars to support evidence-based programs for individuals with serious mental illness, including psychotic disorders. The Substance Abuse and Mental Health Services Administration (SAMHSA), which administers the MHBG, provided guidance to states to encourage a focus on effective treatment for early psychosis, with particular emphasis on CSC (1). In 2016–2020, SAMHSA directed states to increase their investment in CSC by setting aside 10% of their MHBG funds. The law that funds the MHBG was amended to provide states with a supplement to cover the cost of the required 10% set-aside (9).

SMHAs take a variety of approaches to administering the MHBG set-aside funds. For example, SMHAs in Maryland, New York, Ohio, and Oregon work closely with academic institutions to implement CSC programs (8, 12). In contrast, other states contract with third parties to manage the distribution of funding and oversight (13). In 2015, a study of 12 state administrators of the MHBG set-aside funds found differences in how involved and prescriptive SMHAs were in determining the CSC model selected, the programs’ target populations, training needs, and use of funds (10); however, strong state guidance remained one of the facilitating factors for implementing CSC via use of the set-aside funds (10).

The literature on mental health systems change documents the importance of SMHA involvement for implementing evidence-based practices (EBPs) (14–19). When SMHAs administer funding but have little involvement in the implementation process, broad adoption of the practice is often limited (16–19). When SMHAs do not actively promote or engage in decision making around implementing EBPs, recruiting participating clinics is often more difficult and clinic staff turnover is higher (20). However, little is known about how state-level involvement is related to individual client outcomes. Specifically for CSC, it remains unclear whether the level of SMHA involvement with the set-aside–funded CSC programs predicts the success of these programs or plays a role in the lives of clients served.

States received funding increases for CSC in 2021 and 2022 through the COVID-19 Relief Supplement Allotment (21) and American Rescue Plan Supplement (22). With this increased investment in CSC, it is important to examine whether SMHA involvement plays a role in CSC program effectiveness. The purpose of this study was to investigate whether a relationship exists between the level of SMHA involvement with CSC programs and clinical outcomes of clients served within these programs.

Methods

Sample

We collected data from 34 CSC programs included in a larger study examining the use of MHBG set-aside funds for early psychosis care. These CSC programs were selected from a pool of 244 CSC programs identified by SMHAs in their 2017 federal reporting on the distribution of MHBG set-aside funds. From that pool, 10 senior researchers selected 40 potential CSC programs that were actively providing CSC services and that, as a group, represented diversity in geographic distribution, CSC program model (e.g., OnTrackNY, EASA, NAVIGATE), receipt of technical assistance at program start-up, and urban-rural mix. Study team members contacted these 40 programs, 34 of which agreed to participate in the study. The 34 study sites were located across 21 U.S. states and one U.S. territory (see Figure 1), and the SMHAs of these states and this territory were included in the study. All aspects of this study were reviewed and approved by the Westat Research Ethics Board.

FIGURE 1. Coordinated specialty care (CSC) study sites (N=34) in the 21 states and one U.S. territory represented in this studya

aNumbers in each state or territory indicate number of CSC sites.

Data Collection

Client outcomes.

Staff members at the 34 study sites collected outcome data on clients receiving CSC services on entry into the program and every 6 months, for up to 18 months. Outcome data collection ran from January 2018 to July 2019. Demographic and background information included age, gender, race-ethnicity, income, duration of untreated psychosis before the person entered the program, and start of CSC program enrollment for each client. Sites submitted deidentified data through a secure online portal.

Outcome measures included the Modified Colorado Symptom Index (CSI) (23, 24), a 14-item client self-report scale rating the frequency of experiencing psychiatric symptoms in the past month from 0, not at all, to 4, at least every day, with good internal consistency (0.92) and test-retest reliability (0.71) (23); the Terrible to Delighted Quality of Life Scale (25), a one-item self-report rating of how clients feel about their life generally on a scale ranging from 1, terrible, to 10, delighted; the Global Functioning: Social Scale (26, 27), a one-item clinician-completed measure of a client’s social contact and interactions in the past month, scored on a scale ranging from 1, extreme social isolation, to 10, superior social and interpersonal functioning; and the Global Functioning: Role Scale (26, 27), a one-item clinician-completed measure of a client’s functioning in occupational, educational, or homemaker roles in the past month, scored on a scale ranging from 1, extreme role dysfunction, to 10, superior role functioning. The functioning scales have good interrater reliability (intraclass correlation coefficients of 0.85–0.95) and evidence of construct and discriminant validity (26).
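
For orientation only, these scoring conventions can be summarized in a short sketch (hypothetical function and variable names; this is not the study's scoring code): the CSI total is assumed to be the conventional sum of its 14 items, each rated 0–4, and change scores in this article are the final assessment minus the baseline.

```python
# Minimal scoring sketch (hypothetical names; not the study's actual code).

def score_csi(items: list[int]) -> int:
    """Total Modified Colorado Symptom Index: sum of 14 items, each rated 0-4."""
    assert len(items) == 14 and all(0 <= x <= 4 for x in items)
    return sum(items)

def change_score(baseline: float, final: float) -> float:
    """Change score as reported here: final assessment minus baseline assessment."""
    return final - baseline

# Example: a CSI total that drops from 38 at baseline to 24 at the final assessment
print(change_score(score_csi([3] * 10 + [2] * 4), 24))  # -14; negative = symptom reduction
```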

SMHA staff interviews.

Between November 2018 and May 2019, senior members of the study team (S.S.G. and a nonauthor team member) conducted 1-hour, semistructured telephone interviews with the 22 SMHAs responsible for administering MHBG set-aside funding to the 34 study sites. One to five SMHA staff members selected by the SMHA participated in the interviews. After receiving a description of the study, respondents gave verbal informed consent for participation. Discussion included the process for administering MHBG set-aside funds, selection of CSC programs, and SMHA involvement with programs. Interviews were not limited to the role of the SMHA with the 34 study sites and included discussion of SMHA involvement with all MHBG-funded CSC programs within the state.

To develop an indicator of SMHA involvement with CSC programs, we used data from SMHA responses to five questions selected to reflect two main dimensions of involvement: level of ongoing communication with CSC programs and degree of state-level control regarding the way the programs operate, including hiring decisions, CSC program model selection, and admission criteria. We selected these items on the basis of existing work (16, 17) and consultation with experts who work with both the SMHAs and the CSC programs. An important factor for item selection was that federal guidance to states on CSC implementation did not include specific recommendations regarding these areas (10, 28). As such, these five questions revealed variability among SMHAs (see Results). For each “yes” response, the SMHA received 1 point, for a maximum of 5 points. Possible scores ranged from 1 to 5, with higher scores indicating a higher degree of involvement with CSC programs. A senior member of the study team (T.C.D.), who did not participate in the telephone interviews, rated the SMHA responses.
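
A minimal sketch of this rating scheme (hypothetical dictionary keys; the five questions appear in Table 1) assigns one point per "yes" response:

```python
# Sketch of the 5-point SMHA involvement rating (hypothetical keys; see Table 1).

INVOLVEMENT_QUESTIONS = [
    "communicates_with_clinics_monthly",    # Q1: contact at least once per month
    "active_role_in_hiring_and_training",   # Q2: hiring and training of CSC staff
    "determines_csc_model",                 # Q3: selects the CSC model used by clinics
    "works_with_clinics_on_age_range",      # Q4: client age range for enrollment
    "works_with_clinics_on_dup_criterion",  # Q5: duration-of-untreated-psychosis criterion
]

def involvement_score(responses: dict[str, bool]) -> int:
    """One point per 'yes' response; observed scores in this study ranged from 1 to 5."""
    return sum(int(responses[q]) for q in INVOLVEMENT_QUESTIONS)

example = {q: (i < 3) for i, q in enumerate(INVOLVEMENT_QUESTIONS)}  # three "yes" responses
print(involvement_score(example))  # 3
```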

Analyses

We used multilevel random-effect modeling to examine whether the level of SMHA involvement was associated with client outcomes. For each of the outcome measures, only clients who completed a baseline assessment and at least one follow-up assessment were included in the analyses (N=362–435 clients, depending on the measure). To account for the potential effect of attrition (participants with only a baseline assessment vs. those with follow-up data), a propensity score (PS) approach (29) was used to create a response PS for each participant. We estimated the probability of being in the group with follow-up data on the basis of the available participant-level and site-level core covariates. These covariates included length of time between interviews, gender, race-ethnicity, income, age, duration of untreated psychosis, program fidelity score (30), staff turnover rate, urbanicity and poverty level of the service area, number of participants within sites, and the site-level compositions of participants in terms of race-ethnicity, gender, and income. The PSs were included in the analyses to control for potential bias associated with attrition of individuals who had only a baseline score.
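
The response-propensity step can be illustrated with a hedged sketch in Python (statsmodels); column names such as "has_followup" are assumptions, and the study's exact covariate coding is not reproduced here:

```python
# Hedged sketch of the response-propensity estimation (hypothetical column names).
import pandas as pd
import statsmodels.api as sm

def add_response_propensity(df: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    """Estimate each client's probability of having follow-up data and attach it as a column."""
    # Dummy-code categorical covariates and add an intercept term.
    X = sm.add_constant(pd.get_dummies(df[covariates], drop_first=True).astype(float))
    fit = sm.Logit(df["has_followup"].astype(float), X).fit(disp=0)
    out = df.copy()
    out["response_ps"] = fit.predict(X)
    return out

# Example usage with assumed covariate names:
# df = add_response_propensity(df, ["gender", "race_ethnicity", "income", "age", "dup_months",
#                                   "fidelity_score", "turnover_rate", "urbanicity", "n_clients_site"])
```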

The outcome measures were the change scores for the CSI (23, 24), Terrible to Delighted Quality of Life Scale (25), and Global Functioning: Social and Role Scales (26, 27). The models controlled for baseline assessment scores, the PS, client-level covariates (demographic variables from the baseline assessment, including duration of treatment, gender, race-ethnicity, income status, age, and duration of untreated psychosis), and site-level covariates (CSC fidelity score, turnover rate, urbanicity and poverty level of the service area, number of participants within sites, and the site-level compositions of participants in terms of race-ethnicity, gender, and income). The model’s level 1 included client-level covariates, and level 2 included site-level covariates. The models allowed intercepts to vary across the sites.
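
One of the four outcome models might look like the following sketch (a random-intercept mixed model in statsmodels); column names and covariate coding are assumptions, and the remaining site-level covariates would be added to the formula in the same way as the terms shown:

```python
# Hedged sketch of one random-intercept model: CSI change score regressed on SMHA
# involvement level (level 1 as reference), baseline score, response propensity,
# and covariates. Hypothetical column names throughout.
import statsmodels.formula.api as smf

def fit_csi_change_model(df):
    formula = (
        "csi_change ~ C(involvement_level, Treatment(reference=1)) "
        "+ csi_baseline + response_ps + age + C(gender) + C(race_ethnicity) "
        "+ income + dup_months + fidelity_score + turnover_rate + C(urbanicity)"
    )
    model = smf.mixedlm(formula, data=df, groups=df["site_id"])  # intercepts vary by site
    return model.fit(reml=True)

# result = fit_csi_change_model(df)
# print(result.summary())  # coefficients for involvement levels 2-5 vs. level 1
```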

Results

The indicators of SMHA involvement with CSC programs exhibited variability (Table 1). SMHA involvement ranged from 1 to 5 (mean=2.9). Table 2 presents client change scores, in terms of average change (difference between final assessment score and baseline assessment score), for each level of SMHA involvement. In general, the higher the level of SMHA involvement, the larger was the magnitude of the change scores (reflecting positive change).
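
The Table 2 entries are simple group means; a hedged pandas sketch of that tabulation (hypothetical column names) is shown below.

```python
# Sketch of the Table 2 tabulation: mean change score (final minus baseline) per
# outcome measure, grouped by SMHA involvement level. Hypothetical column names.
import pandas as pd

def mean_change_by_involvement(df: pd.DataFrame, measures: list[str]) -> pd.DataFrame:
    changes = {f"{m}_change": df[f"{m}_final"] - df[f"{m}_baseline"] for m in measures}
    return (pd.DataFrame(changes)
              .assign(involvement_level=df["involvement_level"])
              .groupby("involvement_level")
              .mean())

# Example: mean_change_by_involvement(df, ["csi", "qol", "gf_social", "gf_role"])
```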

TABLE 1. Responses of SMHA personnel (N=22) to questions about their SMHA’s involvement with CSC programs in their state or territory funded by Community Mental Health Services Block Grant set-aside fundsa

Question | Yes, N (%) | No, N (%)
1. Does the SMHA communicate with CSC clinics at least once per month? | 15 (68) | 7 (32)
2. Does the SMHA take an active role in the hiring and training of CSC staff? | 7 (32) | 15 (68)
3. Does the SMHA determine the model (e.g., NAVIGATE, EASA, OnTrackNY) used by CSC clinics? | 13 (59) | 9 (41)
4. Does the SMHA work with clinics to determine the age range of clients entering the CSC program? | 16 (73) | 6 (27)
5. Does the SMHA work with clinics to determine the duration of untreated psychosis as an eligibility criterion for clients? | 12 (55) | 10 (45)

aResponses to the questions were used in rating state mental health authority (SMHA) involvement; for each “yes” response, a SMHA received 1 point, for a maximum of 5 points, with higher scores indicating a higher involvement with coordinated specialty care (CSC) programs.


TABLE 2. Change scores of clients in coordinated specialty care (CSC) programs, by level of state mental health authority (SMHA) involvement with these programs

SMHA level of involvementa
Measure | 1 | 2 | 3 | 4 | 5
N of states or territories (N=22) | 4 | 6 | 5 | 3 | 4
Colorado Symptom Index change scoreb | −6.9 | −7.5 | −8.6 | −10.5 | −10.3
Terrible to Delighted Quality of Life Scale change scorec | .39 | .95 | .90 | .62 | 1.30
Global Functioning: Social Scale change scorec | .93 | .66 | 1.27 | .83 | 1.90
Global Functioning: Role Scale change scorec | 1.07 | 1.39 | 1.57 | .73 | 2.18

aLevels of involvement range from 1 to 5, with higher scores indicating a higher degree of involvement with CSC programs.

bThe larger the negative number, the greater the reduction in symptoms.

cHigher change scores indicate greater improvement.


Table 3 lists the regression coefficients for each level of SMHA involvement. SMHAs with the lowest involvement (score of 1) served as the reference group in all four models; the regression coefficients of the control variables are not shown. Each of the four groups (SMHA scores of 2–5) was compared with SMHAs that scored 1 on the involvement scale. As shown in the table, clients in CSC programs with SMHAs that were the most involved (level 5) had significantly lower final assessment CSI scores than clients at clinics with SMHA level 1 involvement (β=−3.65, p=0.036), indicating a larger reduction in symptoms for clients at clinics with greater SMHA involvement. Similarly, clients at clinics with SMHA level 5 involvement had significantly higher Global Functioning: Social and Role Scale scores than clients at clinics with SMHA level 1 involvement (βsocial=1.32, p<0.001; βrole=1.13, p=0.019), indicating larger improvements in social and role functioning at clinics with greater SMHA involvement.

TABLE 3. Results of regression analyses of the relationship between SMHA involvement and client outcomes

SMHA level of involvementa
Measure | 2 | 3 | 4 | 5
CSI change score | −1.20 | −.76 | −3.51 | −3.65*
Terrible to Delighted Quality of Life Scale change score | .55 | .36 | .31 | .49
Global Functioning: Social Scale change score | .32 | .34 | .31 | 1.32***
Global Functioning: Role Scale change score | .48 | .27 | .01 | 1.13*

aValues are regression coefficients of multilevel, random-effect modeling for each of the client outcome measures. Levels of involvement ranged from 1 to 5, with higher scores indicating a higher degree of involvement with coordinated specialty care programs. State mental health authorities (SMHAs) scoring at level 1 were used as the reference group for the four groups. Lower Colorado Symptom Index (CSI) scores indicate a larger reduction in symptoms. Higher scores on the other scales indicate larger improvements in quality of life or functioning.

*p<0.05, ***p<0.001.


Even when the analysis controlled for demographic variables and the PS, clients at clinics with the most involved SMHAs (score of 5) showed the largest improvements in symptoms, social functioning, and role functioning. For the quality-of-life measure, clients at clinics with SMHA level 2 involvement also had higher scores than clients at clinics with SMHA level 1 involvement, although the difference showed only a trend toward statistical significance (β=0.55, p=0.056).

Discussion

Our results are consistent with those of previous studies (16–20, 31–33) and emphasize the importance of SMHA involvement for implementing EBPs. In our study, SMHA involvement was characterized as frequent communication with CSC programs, involvement in hiring and training of CSC staff, and involvement in decision making about CSC program delivery. Increased SMHA involvement predicted positive client outcomes in CSC programs. This trend was true for multiple client outcomes, including improved symptoms, role functioning, social functioning, and quality of life, although improvement in quality of life showed only a trend toward significance.

Previous studies have shown that SMHAs with more involvement and stronger relationships with providers and clinics experience more success in statewide implementation of EBPs (17, 18, 20, 31–33) and sustainability of EBPs (18). A study of eight SMHAs and their role in implementing EBPs found that the most successful states had SMHAs that not only understood the components of services being offered but also directed staff members on how to implement service components (17).

Unlike in other situations where SMHAs initiate and implement EBPs within their state to address a particular need, in this study every SMHA was required to administer the funding for the MHBG set-aside. One could argue that the administration of the MHBG set-aside funding was thrust upon SMHAs, and some knew little about first-episode psychosis. Additionally, CSC itself is relatively new in the United States, compared with other countries (1, 28). The earliest U.S. programs date back to 2000 (28). CSC models and CSC service components vary across the United States. Although states such as New York, Oregon, and Washington have adopted a single programmatic model for implementing CSC, most states have not (28). In a recent study of CSC implementation in California, programs across 58 counties implemented different versions of CSC, with varying service components, and in some cases, programs differed within the same county (34). In a study of 12 states, Horvitz-Lennon and colleagues (10) noted that although the MHBG set-aside funding for CSC was part of a legislative initiative, SMHAs did not have guidance about how much programmatic involvement they should have, which strategies they should use to implement CSC, or how they should administer the funds. As such, states varied widely in terms of their involvement in prescribing CSC implementation (10).

Despite the diversity among CSC programs and state contextual factors, in our sample of 34 programs across 21 states and one U.S. territory, greater SMHA involvement predicted better client outcomes. This trend held true even after the analysis was controlled for important predictors, such as baseline assessment scores, client-level demographic variables, and site-level covariates. We note that we do not believe that the relationship between SMHA involvement and client outcomes is linear, nor does the study support a causal hypothesis. Indeed, moderate levels of SMHA involvement did not necessarily predict better client outcomes, compared with low levels of SMHA involvement. Only the highest ratings of SMHA involvement predicted positive client outcomes.

Many alternative hypotheses might explain our findings because we could not measure all the potentially relevant features of the participating sites. For example, the most involved SMHAs may specify stricter enrollment criteria for their CSC sites, which may lead to better outcomes because the programs have fewer clients with co-occurring and complicating conditions. Another factor may be overall CSC program budgets, because funding may be a powerful predictor of client outcomes. We were not able to assess either of these factors.

Another possible explanation for the findings may be that states with a long history of CSC implementation may have more involved SMHAs. In our study, nine CSC programs across five states reported that they provided CSC services before the availability of the 2014 MHBG set-aside funding. In our small sample, having an older program within the state did not indicate that the SMHA was more involved, and among the five states with programs initiated before 2014, only one SMHA received the highest rating of 5. One SMHA received a score of 3, one received a score of 4, and two received a score of 2.

Additionally, the involvement of a CSC Resource Center or of academic institutions with CSC expertise might explain the study findings. Although we did not directly test this hypothesis, seven SMHAs reported that they worked closely with a third party (a CSC Resource Center, academic institution, or managed care company) that managed and trained CSC staff members. Only one of these SMHAs received a rating of 5 (highest), one received a rating of 4, two received a rating of 3, two received a rating of 2, and one received a rating of 1. We did not have enough information to fully explore how this relationship may have influenced outcomes.

These findings are exploratory, because the study had several limitations. The CSC programs included in our study were chosen, in part, because of their reputation for providing good CSC care, which could have led to a biased estimate of the effect of SMHA involvement. However, enough variability existed in the data on outcomes and involvement to reveal a relationship. Our sample may not be representative of all CSC programs in the United States. Also, the inclusion of select CSC programs within each state and not the full array of CSC programs in the state may have limited the range of client outcomes, because some programs included in this study may have had better outcomes than other programs in the state.

Although we controlled for several site-level variables, the nature of the study and the number of study sites limited our ability to assess other variables that might be equally indicative of SMHA involvement or predictive of outcomes. For example, SMHA knowledge about and experience with CSC may explain how involved SMHAs are with CSC programs; however, we did not assess this variable. Additionally, we note that only one researcher on the study team rated the presence or absence of each factor by using interview transcripts; thus, these assessments could have been biased. Finally, the analyses did not include follow-up information that could have clarified these processes. For example, seven (32%) of the SMHAs reported involvement in the hiring of CSC staff members, but we did not have information about the extent of their involvement (e.g., reviewing resumes or participating in interviews). Additional detail could have helped us understand which SMHA activities drove the findings.

Conclusions

This study provides suggestive evidence about the important role that SMHAs can play in improving the implementation of CSC and in supporting outcomes for service users. The findings support previous studies that emphasize the power of SMHAs in “championing” EBP implementation within states. With the recent increase in SMHA-administered funding for CSC programs, it will be important for SMHAs to recognize the potential role they have in supporting clinical improvements in clients’ lives.

Westat, Rockville, Maryland (Ghose, George, Ren, Zhu, Rosenblatt); Department of Psychiatry, School of Medicine, University of Maryland, Baltimore (Goldman); Abt Associates, Durham, North Carolina (Daley); New York State Psychiatric Institute and Department of Psychiatry, Columbia University Vagelos College of Physicians and Surgeons, New York City (Dixon).
Send correspondence to Dr. Ghose (). Dr. Dixon is the Editor of the journal. Marvin S. Swartz, M.D., served as decision editor on the manuscript.

The Mental Health Block Grant 10% Study was supported by the Substance Abuse and Mental Health Services Administration and the National Institute of Mental Health (task order HHSS283201200011I/HHSS28342008T, reference no. 283-12-1108).

The authors report no financial relationships with commercial interests.

References

1. Heinssen RK, Goldstein AB, Azrin ST: Evidence-Based Treatments for First Episode Psychosis: Components of Coordinated Specialty Care. Bethesda, MD, National Institute of Mental Health, 2014. https://www.nimh.nih.gov/health/topics/schizophrenia/raise/evidence-based-treatments-for-first-episode-psychosis-components-of-coordinated-specialty-care. Accessed Jul 18, 2022

2. Correll CU, Galling B, Pawar A, et al.: Comparison of early intervention services vs treatment as usual for early-phase psychosis: a systematic review, meta-analysis, and meta-regression. JAMA Psychiatry 2018; 75:555–565

3. Dixon LB, Goldman HH, Bennett ME, et al.: Implementing coordinated specialty care for early psychosis: the RAISE Connection Program. Psychiatr Serv 2015; 66:691–698

4. Dixon LB, Goldman HH, Srihari VH, et al.: Transforming the treatment of schizophrenia in the United States: the RAISE initiative. Annu Rev Clin Psychol 2018; 14:237–258

5. Kane JM, Robinson DG, Schooler NR, et al.: Comprehensive versus usual community care for first-episode psychosis: 2-year outcomes from the NIMH RAISE Early Treatment Program. Am J Psychiatry 2016; 173:362–372

6. Penn DL, Waldheter EJ, Perkins DO, et al.: Psychosocial treatment for first-episode psychosis: a research update. Am J Psychiatry 2005; 162:2220–2232

7. Randall JR, Vokey S, Loewen H, et al.: A systematic review of the effect of early interventions for psychosis on the usage of inpatient services. Schizophr Bull 2015; 41:1379–1386

8. Lutterman T, Kazandjian M, Urff J: Snapshot of State Plans for Using the Community Mental Health Block Grant 10 Percent Set-Aside to Address First Episode Psychosis. Alexandria, VA, National Association of State Mental Health Program Directors, 2018. https://www.nasmhpd.org/sites/default/files/Snapshot_of_State_Plans.pdf

9. Funding and Characteristics of Single State Agencies for Substance Abuse Services and State Mental Health Agencies, 2015. Rockville, MD, Substance Abuse and Mental Health Services Administration, 2017. https://store.samhsa.gov/product/Funding-and-Characteristics-of-Single-State-Agencies-for-Substance-Abuse-Services-and-State-Mental-Health-Agencies-2015/SMA17-5029. Accessed Aug 11, 2022

10. Horvitz-Lennon M, Breslau J, Scharf D, et al.: Case Study: Early Assessment of the Mental Health Block Grant Set-Aside Program for Addressing First Episode Psychosis and Other Serious Mental Illness. Washington, DC, US Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation, 2015. https://aspe.hhs.gov/system/files/pdf/205071/MHBGsetaside.pdf

11. Block Grant Reporting Section FY 2014—CFDA 93.958 (Mental Health). Rockville, MD, Substance Abuse and Mental Health Services Administration, 2015. https://www.samhsa.gov/sites/default/files/grants/fy14-15-block-grant-reporting.pdf

12. Salerno A, Dixon LB, Myers RW, et al.: A public-academic partnership to support a state mental health authority’s strategic planning and policy decisions. Psychiatr Serv 2011; 62:1413–1415

13. Shern D: Financing Coordinated Specialty Care for First Episode Psychosis: A Clinician/Advocate’s Guide. Washington, DC, American Psychiatric Association, 2020. https://smiadviser.org/wp-content/uploads/2020/08/CSC-Issue-Brief_v2-081320.pdf

14. Bickman L, Lambert EW, Andrade AR, et al.: The Fort Bragg continuum of care for children and adolescents: mental health outcomes over 5 years. J Consult Clin Psychol 2000; 68:710–716

15. Goldman HH, Morrissey JP, Ridgely MS: Evaluating the Robert Wood Johnson Foundation Program on chronic mental illness. Milbank Q 1994; 72:37–47

16. Isett KR, Burnam MA, Coleman-Beattie B, et al.: The state policy context of implementation issues for evidence-based practices in mental health. Psychiatr Serv 2007; 58:914–921

17. Isett KR, Burnam MA, Coleman-Beattie B, et al.: The role of state mental health authorities in managing change for the implementation of evidence-based practices. Community Ment Health J 2008; 44:195–211

18. Jones AM, Bond GR, Peterson AE, et al.: Role of state mental health leaders in supporting evidence-based practices over time. J Behav Health Serv Res 2014; 41:347–355

19. Tessler RC, Bernstein AG, Rosen BM, et al.: The chronically mentally ill in community support systems. Hosp Community Psychiatry 1982; 33:208–211

20. Mancini AD, Moser LL, Whitley R, et al.: Assertive community treatment: facilitators and barriers to implementation in routine mental health settings. Psychiatr Serv 2009; 60:189–195

21. Letter to Single State Authority Directors and State Mental Health Commissioners. Rockville, MD, Substance Abuse and Mental Health Services Administration, 2021. https://oasas.ny.gov/system/files/documents/2021/05/samhsa_guidance_arp_funding.pdf

22. FY 2021 Community Mental Health Block Grant Program American Rescue Plan Supplemental Awards. Rockville, MD, Substance Abuse and Mental Health Services Administration, 2021. https://www.samhsa.gov/grants/block-grants/mhbg-american-rescue-plan. Accessed Jul 18, 2022

23. Boothroyd RA, Chen HJ: The psychometric properties of the Colorado Symptom Index. Adm Policy Ment Health 2008; 35:370–378

24. Shern DL, Wilson NZ, Coen AS, et al.: Client outcomes: II. Longitudinal client data from the Colorado Treatment Outcome Study. Milbank Q 1994; 72:123–148

25. Lehman AF: A quality of life interview for the chronically mentally ill. Eval Program Plann 1988; 11:51–60

26. Cornblatt BA, Auther AM, Niendam T, et al.: Preliminary findings for two new measures of social and role functioning in the prodromal phase of schizophrenia. Schizophr Bull 2007; 33:688–702

27. Piskulic D, Addington J, Auther A, et al.: Using the Global Functioning Social and Role Scales in a first-episode sample. Early Interv Psychiatry 2011; 5:219–223

28. Read H, Kohrt B: The history of coordinated specialty care for early intervention in psychosis in the United States: a review of effectiveness, implementation, and fidelity. Community Ment Health J 2022; 58:835–846

29. Rosenbaum PR, Rubin DB: The central role of the propensity score in observational studies for causal effects. Biometrika 1983; 70:41–55

30. Addington D, Noel V, Landers M, et al.: Reliability and feasibility of the First-Episode Psychosis Services Fidelity Scale–Revised for remote assessment. Psychiatr Serv 2020; 71:1245–1251

31. Cook JA, Ruggiero K, Shore S, et al.: Public-academic collaboration in the application of evidence-based practice in Texas mental health system redesign. Int J Ment Health 2007; 36:36–49

32. Drake RE, Bond GR, Essock SM: Implementing evidence-based practices for people with schizophrenia. Schizophr Bull 2009; 35:704–713

33. Finnerty MT, Rapp CA, Bond GR, et al.: The State Health Authority Yardstick (SHAY). Community Ment Health J 2009; 45:228–236

34. Niendam TA, Sardo A, Savill M, et al.: The rise of early psychosis care in California: an overview of community and university-based services. Psychiatr Serv 2019; 70:480–487