Thursday, October 3, 2019

What’s New in Hip Replacement
No abstract available
Prevalence of Clinical Depression Among Patients After Shoulder Stabilization: A Prospective Study
Background: Depression is a potential risk factor for poor postoperative outcomes. This study aimed to identify the prevalence of clinical depression symptoms before and after shoulder stabilization, as well as the relationship between depression and functional outcomes. Methods: Patients undergoing arthroscopic primary glenohumeral stabilization for recurrent instability were eligible for enrollment. Participants completed the Quick Inventory of Depressive Symptomatology-Self Report (QIDS-SR) and the Western Ontario Shoulder Instability Index (WOSI) questionnaire preoperatively and at 6 weeks, 3 months, 6 months, and 1 year postoperatively. Patients with a preoperative QIDS-SR score of ≥6 were assigned to the clinical depression group. Results: Seventy-six patients were enrolled and were prospectively followed during this study. Thirty-nine patients were stratified into the clinical depression group. Preoperatively, the clinical depression cohort had worse WOSI scores than the cohort without clinical depression (mean difference, 8.3% [95% confidence interval (CI), 0.5% to 16.1%]; p = 0.04). Both the clinical depression cohort and the cohort without clinical depression displayed improvements in WOSI and QIDS-SR scores at 1 year postoperatively (p < 0.01 for both cohorts and both measures). At 1 year postoperatively, the clinical depression cohort continued to have worse WOSI scores than the cohort without clinical depression (mean difference, 12.2% [95% CI, 5.9% to 18.5%]; p < 0.01) and worse QIDS-SR scores; the median QIDS-SR score was 5.0 points (interquartile range [IQR], 2.0 to 8.0 points) for the clinical depression group and 0.0 points (IQR, 0.0 to 3.0 points) for the group without clinical depression (p < 0.01). The postoperative prevalence of clinical depression (24%) was lower than the preoperative prevalence (51%) (p < 0.01). Increasing patient age was associated with preoperative depression symptoms (odds ratio, 3.1; p = 0.03). Conclusions: Fifty-one percent of patients with shoulder instability reported depression symptoms before the surgical procedure. Surgical intervention improved shoulder function and depression symptoms over time; however, the clinical depression cohort had worse postoperative shoulder and depression outcomes. Level of Evidence: Prognostic Level II. See Instructions for Authors for a complete description of levels of evidence.
Long-Term Results of Patellar Bone-Grafting for Severe Patellar Bone Loss During Revision Total Knee Arthroplasty
Background: There is no consensus on managing severe patellar bone loss after total knee arthroplasty. We previously described an initial series involving a novel technique of patellar bone-grafting with a short follow-up. The purpose of this study was to determine long-term survivorship and the radiographic and clinical results of patellar bone-grafting during revision total knee arthroplasty in a larger series with an extended follow-up. Methods: We identified 90 patients from a single institution who underwent 93 patellar bone-grafting procedures for severe patellar bone loss from 1997 to 2014. The mean age of the patients was 70 years, and 46% of patients were female. Forty-five knees (48%) underwent first-time revisions, and 19 knees (20%) had undergone a failed attempt at patellar resurfacing. Intraoperative patellar caliper thickness increased from a mean of 7 to 25 mm after patellar bone-grafting (p < 0.01). Radiographic review determined changes in patellar height, tracking, and remodeling. Knee Society scores (KSSs) were calculated. The mean follow-up was 8 years (range, 2 to 18 years). Kaplan-Meier methods determined survivorship free of any revision and any reoperation. Cox proportional hazards analysis determined predictive factors for failure. Results: Survivorship free of patellar revision was 96% at 10 years. Survivorship free of any revision was 84% at 10 years. Survivorship free of any reoperation was 78% at 10 years. Increasing patient age was the only protective factor against further patellar revision (hazard ratio, 0.95; p < 0.01). When comparing initial radiographs with final radiographs, patellar height decreased from 22 to 19 mm (p < 0.01), 80% compared with 59% of patellae articulated centrally in the trochlea (p = 0.01), and 32% compared with 77% had remodeling over the lateral femoral condyle (p < 0.01). Anterior knee pain decreased from 51% to 27% postoperatively (p = 0.01). The mean knee flexion improved from 101° to 108° (p = 0.03). The mean KSS improved from 50 to 85 points (p < 0.01). Conclusions: Reliable long-term clinical results can be expected with patellar bone-grafting for severe patellar bone loss during revision total knee arthroplasty. Pain, range of motion, and other reported outcomes improve despite radiographic changes to patellar height, tracking, and remodeling. This technique is a durable and reliable option when standard patellar resurfacing is not possible. Level of Evidence: Therapeutic Level IV. See Instructions for Authors for a complete description of levels of evidence.
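The survivorship and hazard-ratio figures above come from standard time-to-event methods. As a purely illustrative aid, and not the authors' analysis, the sketch below shows how survivorship free of revision at 10 years and a hazard ratio for patient age could be estimated with the Python lifelines library on synthetic data; all variable names, sizes, and values are placeholder assumptions.

```python
# Illustrative sketch only: synthetic data, not the study's dataset.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 93  # hypothetical number of knees, mirroring the series size
df = pd.DataFrame({
    "years_followed": rng.uniform(2, 18, n),   # follow-up time in years
    "revised": rng.binomial(1, 0.15, n),       # 1 = underwent any revision
    "age_at_surgery": rng.normal(70, 9, n),    # patient age at index revision
})

# Kaplan-Meier estimate of survivorship free of any revision
kmf = KaplanMeierFitter()
kmf.fit(df["years_followed"], event_observed=df["revised"], label="free of any revision")
print(kmf.survival_function_at_times(10))  # survivorship estimate at 10 years

# Cox proportional hazards model: hazard ratio per year of age
cph = CoxPHFitter()
cph.fit(df[["years_followed", "revised", "age_at_surgery"]],
        duration_col="years_followed", event_col="revised")
print(cph.hazard_ratios_)  # an HR < 1 would indicate older age is protective
```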
Decision Support Strategies for Hip and Knee Osteoarthritis: Less Is More: A Randomized Comparative Effectiveness Trial (DECIDE-OA Study)
Background: As guidelines and payers increasingly recommend use of patient decision aids (DAs), evidence about the comparative effectiveness of available DAs is critical for organizations interested in implementing them. The primary purpose of this study was to compare 2 DAs with regard to their ability to help patients become informed and receive their preferred treatment (that is, make an informed patient-centered decision), shared decision-making, surgical rates, and surgeon satisfaction. Methods: We performed a multisite factorial randomized trial enrolling patients with hip or knee osteoarthritis. Patients were randomly assigned to use a long, detailed DA (long DA) or short, interactive DA (short DA). Eight surgeons were randomly assigned to receive a patient preference report detailing the patient’s goals and treatment preferences or to administer usual care. Results: We distributed 1,636 pre-visit surveys, 1,220 of which were returned (75% response rate), and 1,124 post-visit surveys, 967 of which were returned (86% response rate). The patients in the sample had a mean age (and standard deviation) of 65 ± 10 years, 57% were female, 89% were white non-Hispanic, and 67% had knee osteoarthritis. The majority (67.2%) made informed patient-centered decisions, and the rate did not vary significantly between the DA groups (p = 0.97) or between the surgeon groups (p = 0.23). Knowledge scores were higher for the short-DA group (mean difference = 9%; p < 0.001). More than half of the sample (60.5%) had surgery within 6 months after the visit, and rates did not differ significantly by DA or surgeon group. Overall, the surgeons were highly satisfied and reported that the majority (88.7%) of the visits were of normal duration or shorter. Conclusions: The DECIDE-OA study is, to our knowledge, the first randomized comparative effectiveness study of 2 orthopaedic DAs. The short DA outperformed the long DA with regard to knowledge scores and was comparable with respect to other outcomes. The surgeons reported high satisfaction and normal visit duration with both DAs. Clinical Relevance: Surgeons need to ensure that patients with osteoarthritis are well-informed and have a clear preference regarding whether to undergo hip or knee replacement surgery. The DAs used in this study may help surgeons involve patients in elective surgery decisions and meet the requirements of informed consent.
Outcomes of 188 Proximal Humeral Fractures Treated with a Dedicated External Fixator with Follow-up Ranging from 2 to 12 Years
Background: The treatment of a displaced proximal humeral fracture is still a matter of controversy. Minimally invasive techniques are considered promising options. The purpose of this study was to report outcomes at medium to long-term follow-up after surgical treatment with pins stabilized with an external fixator. Methods: A total of 235 patients (average age, 64 years [95% confidence interval (CI), 62 to 65 years]) were treated with closed or open reduction and fixation with pins stabilized by an external fixator specifically designed for proximal humeral fractures. The pins were inserted using a “pins-crossing-fracture” or a “pins-bridging-fracture” technique. One hundred and eighty-eight patients had a minimum radiographic and clinical follow-up of 2 years. Outcomes were assessed using the Oxford Shoulder Score (OSS), the subjective shoulder value (SSV), a visual analog scale (VAS) for pain, and, for 155 patients, the Constant score. Results: Eighty-one (43%) of the 188 patients had a 2-part fracture, 60 (32%) had a 3-part fracture, and 47 (25%) had a 4-part fracture. The reduction was performed with percutaneous maneuvers in 120 shoulders and with a deltopectoral approach in 68. The external fixator was applied using a “pins-crossing-fracture” technique in 133 shoulders and using a “pins-bridging-fracture” technique in 55. At last follow-up, mean clinical scores were as follows: OSS, 42.6 (95% CI, 42 to 44); SSV, 85.5 (95% CI, 83 to 88); and VAS for pain, 1 (95% CI, 0.7 to 1.2). The complication rate at 3 months was 16% (37 of 235). The most frequent complication was pin-track infection (19 of 235, 8%). A total of 50 patients had ≥1 complication (50 of 188, 27%) and 6 (3%) underwent revision surgery. More complications were observed with the “pins-crossing-fracture” technique. Conclusions: In our experience, the use of the external fixator has been a valuable option in the treatment of proximal humeral fractures. The complication and revision rates were acceptable. Most of the complications encountered were manageable without revision surgery. Level of Evidence: Therapeutic Level IV. See Instructions for Authors for a complete description of levels of evidence.
The Influence of Preoperative Radiographic Patellofemoral Degenerative Changes and Malalignment on Patellofemoral-Specific Outcome Scores Following Fixed-Bearing Medial Unicompartmental Knee Arthroplasty
Background: There is controversy as to whether the presence of degenerative changes and malalignment of the patellofemoral joint is a contraindication to medial unicompartmental knee arthroplasty. Therefore, the aim of the present study was to examine the influence of preoperative radiographic patellofemoral joint osteoarthritis and alignment on intermediate-term knee and patellofemoral joint-specific patient-reported outcomes following fixed-bearing medial unicompartmental knee arthroplasty. Methods: We performed a retrospective review of the records on a consecutive series of patients who had undergone robotic arm-assisted fixed-bearing onlay medial unicompartmental knee arthroplasty and had a minimum duration of follow-up of 2 years. All records were collected from a single surgeon’s arthroplasty registry. Patients with severe bone loss or grooving of the lateral patellar facet were excluded. Radiographic assessment was performed with use of the Kellgren-Lawrence and Altman classification systems as well as with patellofemoral joint alignment measurements. The latest follow-up consisted of a patient-reported questionnaire, including the Kujala (Anterior Knee Pain Scale) score, the Knee Injury and Osteoarthritis Outcome Score, Junior (KOOS, JR), and satisfaction levels. Results: A total of 536 patients (639 knees) were included. After a mean duration of follow-up (and standard deviation) of 4.3 ± 1.6 years (range, 2.0 to 9.2 years), good-to-excellent Kujala scores were reported independent of the presence of patellofemoral joint osteoarthritis preoperatively (Kellgren-Lawrence grade 0 compared with ≥1, p = 0.82; grade ≤1 compared with ≥2, p = 0.84). Similar findings were observed when osteoarthritis was present in either the medial or lateral side of the patellofemoral joint as defined by an Altman score of ≥2 (medial, p = 0.81; lateral, p = 0.90). KOOS scores and satisfaction also were not affected by degenerative patellofemoral joint changes. Furthermore, neither the patellar tilt angle nor the congruence angle influenced patient-reported outcomes. Conclusions: Preoperative radiographic mild to moderate patellofemoral joint degeneration (Kellgren-Lawrence grades 1 through 3) and/or malalignment did not compromise intermediate-term knee and patellofemoral joint-specific patient-reported outcomes in patients managed with fixed-bearing medial unicompartmental knee arthroplasty. On the basis of the results of the present study, we believe that neither mild to moderate patellofemoral degeneration nor abnormal patellar tilt or congruence should be considered a contraindication to fixed-bearing medial unicompartmental knee arthroplasty. Level of Evidence: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.
Preoperative Risk Factors for Opioid Utilization After Total Hip Arthroplasty
Background: Opioid prescriptions following orthopaedic procedures may contribute to the opioid epidemic in the United States. Risk factors for greater and prolonged opioid utilization following total hip arthroplasty have yet to be fully elucidated. We sought to determine the prevalence of preoperative and postoperative opioid utilization in a cohort of patients who underwent total hip arthroplasty and to identify preoperative risk factors for prolonged utilization of opioids following total hip arthroplasty. Methods: A cohort study of patients who underwent primary elective total hip arthroplasty at Kaiser Permanente from January 2008 to December 2011 was conducted. The number of opioid prescriptions dispensed per 90-day period after total hip arthroplasty (up to 1 year) was the outcome of interest. The risk factors evaluated included preoperative analgesic medication use, patient demographic characteristics, comorbidities, and other history of chronic pain. Poisson regression models were used, and relative risks (RRs) and 95% confidence intervals (CIs) are presented. Results: Of the 12,560 patients who underwent total hip arthroplasty and were identified, 58.5% were female and 78.6% were white. The median age was 67 years (interquartile range, 59 to 75 years). Sixty-three percent of patients filled at least 1 opioid prescription in the 1 year prior to the total hip arthroplasty. Postoperative opioid use decreased from 88.6% in days 1 to 90 to 24% in days 271 to 360. An increasing number of preoperative opioid prescriptions was associated with a greater number of prescriptions over the entire postoperative period, with an RR of 1.10 (95% CI, 1.10 to 1.11) at days 271 to 360. Additional factors associated with greater utilization over the entire year included black race, chronic pulmonary disease, anxiety, substance abuse, and back pain. Factors associated with greater utilization in days 91 to 360 (beyond the early recovery phase) included female sex, higher body mass index, acquired immunodeficiency syndrome, peripheral vascular disease, and history of non-specific chronic pain. Conclusions: We identified preoperative factors associated with greater and prolonged opioid utilization long after the early recovery period following total hip arthroplasty. Patients with these risk factors may benefit from targeted multidisciplinary interventions to mitigate the risk of prolonged opioid use. Clinical Relevance: Opioid prescriptions following orthopaedic procedures are one of the leading causes of chronic opioid use; strategies to reduce the risk of misuse and abuse are needed. Almost one-quarter of patients who underwent total hip arthroplasty used opioids in the last 90 days of the first postoperative year, which makes understanding the risk factors associated with postoperative opioid utilization imperative.
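The relative risks reported above are rate ratios from Poisson regression on prescription counts. As a purely illustrative aid, and not the authors' model, the sketch below shows how RRs and 95% CIs of this kind can be obtained from a Poisson GLM with Python's statsmodels on synthetic data; every variable name, coefficient, and value is a placeholder assumption.

```python
# Illustrative sketch only: synthetic data, not the study cohort.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "preop_opioid_rx": rng.poisson(1.5, n),   # prescriptions filled in the year before surgery
    "age": rng.normal(67, 10, n),
    "female": rng.binomial(1, 0.585, n),
})
# Hypothetical outcome: prescriptions dispensed in a 90-day postoperative period
df["postop_rx"] = rng.poisson(np.exp(-0.5 + 0.09 * df["preop_opioid_rx"]), n)

X = sm.add_constant(df[["preop_opioid_rx", "age", "female"]])
fit = sm.GLM(df["postop_rx"], X, family=sm.families.Poisson()).fit()

rr = np.exp(fit.params)       # relative risk (rate ratio) per unit increase in each predictor
ci = np.exp(fit.conf_int())   # 95% confidence intervals on the RR scale
print(pd.concat([rr.rename("RR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```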
Quality Improvement of Magnetic Resonance Imaging for Musculoskeletal Infection in Children Results in Decreased Scan Duration and Decreased Contrast Use
Background: Magnetic resonance imaging (MRI) is a heavily utilized resource to evaluate children suspected to have a musculoskeletal infection. Complex interdisciplinary workflows are involved with decision-making with regard to indications, anesthesia, contrast use, and procedural timing relative to the scan. This study assesses the impact of a quality improvement endeavor on MRI workflows at a tertiary pediatric medical center. Methods: A registry of consecutively enrolled children for a multidisciplinary musculoskeletal infection program identified those evaluated with MRI from 2012 to 2018. Annual MRI process improvement feedback was provided to the key stakeholders. Demographic characteristics, laboratory parameters, MRI indications, anesthesia use, MRI findings, final diagnoses, scan duration, imaging protocol, surgical intervention following MRI, and length of stay were retrospectively compared between the 3 cohorts (initial, middle, and final) representing 2-year increments to assess the impact of the initiative. Results: A total of 526 initial MRI scans were performed among the 1,845 children evaluated for suspected musculoskeletal infection. Anesthesia was used in 401 children (76.2%). When comparing the initial, middle, and final study period cohorts, significant improvement was demonstrated for the number of sequences per scan (7.5 sequences for the initial cohort, 5.8 sequences for the middle cohort, and 4.6 sequences for the final cohort; p < 0.00001), scan duration (73.6 minutes for the initial cohort, 52.1 minutes for the middle cohort, and 34.9 minutes for the final cohort; p < 0.00001), anesthesia duration (94.1 minutes for the initial cohort, 68.9 minutes for the middle cohort, and 53.2 minutes for the final cohort; p < 0.00001), and the rate of contrast use (87.6% for the initial cohort, 67.7% for the middle cohort, and 26.3% for the final cohort; p < 0.00001). There was also a trend toward a higher rate of procedures under continued anesthesia immediately following the MRI (70.2% in the initial cohort, 77.8% in the middle cohort, and 84.6% in the final cohort). During the final 6-month period, the mean scan duration was 24.4 minutes, anesthesia duration was 40.9 minutes, and the rate of contrast administration was 8.5%. Conclusions: Progressive quality improvement through collaborative interdisciplinary communication and workflow redesign led to improved utilization of MRI and minimized contrast use for suspected musculoskeletal infection. There was a high rate of procedural intervention under continued anesthesia for children with confirmed musculoskeletal infection. Level of Evidence: Therapeutic Level IV. See Instructions for Authors for a complete description of levels of evidence.
Polymyxin and Bacitracin in the Irrigation Solution Provide No Benefit for Bacterial Killing in Vitro
Background: Many surgeons add topical antibiotics to irrigation solutions assuming that this has a local effect and eliminates bacteria. However, prior studies have suggested that the addition of antibiotics to irrigation solution confers little benefit, adds cost, may potentiate anaphylactic reactions, and may contribute to antimicrobial resistance. We sought to compare the antimicrobial efficacy and cytotoxicity of an irrigation solution containing polymyxin-bacitracin with other commonly used irrigation solutions. Methods: Staphylococcus aureus and Escherichia coli were exposed to irrigation solutions containing topical antibiotics (500,000-U/L polymyxin and 50,000-U/L bacitracin; 1-g/L vancomycin; or 80-mg/L gentamicin), as well as commonly used irrigation solutions (saline solution 0.9%; povidone-iodine 0.3%; chlorhexidine 0.05%; Castile soap 0.45%; and sodium hypochlorite 0.125%). Following 1 and 3 minutes of exposure, surviving bacteria were manually counted. A solution that failed to eradicate all bacteria in any of the 3 replicates was considered not effective. Cytotoxicity analysis in human fibroblasts, osteoblasts, and chondrocytes exposed to the irrigation solutions was performed by visualization of cell structure and was quantified by lactate dehydrogenase (LDH) activity. Efficacy and cytotoxicity were assessed in triplicate experiments, with generalized linear mixed models. Results: Polymyxin-bacitracin, saline solution, and Castile soap at both exposure times were not effective at eradicating S. aureus or E. coli. In contrast, povidone-iodine, chlorhexidine, and sodium hypochlorite irrigation were effective against both S. aureus and E. coli (p < 0.001). Vancomycin irrigation was effective against S. aureus but not against E. coli, whereas gentamicin irrigation showed partial efficacy against E. coli but none against S. aureus. Within fibroblasts, the greatest cytotoxicity was seen with chlorhexidine (mean [and standard error], 49.38% ± 0.80%; p < 0.0001), followed by Castile soap (33.57% ± 2.17%; p < 0.0001) and polymyxin-bacitracin (8.90% ± 1.40%). Povidone-iodine showed the least cytotoxicity of the efficacious solutions (5.00% ± 0.86%). Similar trends were seen at both exposure times and across fibroblasts, osteoblasts, and chondrocytes. Conclusions: Irrigation with polymyxin-bacitracin was ineffective at bacterial eradication and statistically inferior to povidone-iodine. Chlorhexidine lavage conferred the greatest in vitro cytotoxicity. Clinical Relevance: These data suggest that the addition of polymyxin-bacitracin to saline solution irrigation has little value. Given the cost and antimicrobial resistance implications, our findings, combined with prior clinical literature, provide adequate reason to avoid widespread use of antibiotics in irrigation solutions. Povidone-iodine may be a more effective and safer option.
Association Between Sagittal Spinal Alignment and Physical Function in the Japanese General Elderly Population: A Japanese Cohort Survey Randomly Sampled from a Basic Resident Registry
Background: The extension of healthy life expectancy has become increasingly important because of rising health-care costs and decreases in the quality of life in the elderly population. Although an association between sagittal spinal alignment and physical performance has been reported, such studies in the healthy population are limited. This study investigated the relationship between sagittal spinal alignment and physical function in the general elderly population. Methods: Registered citizens who were 50 to 89 years of age were targeted for this survey. We established 8 groups based on age (50 to 59 years, 60 to 69 years, 70 to 79 years, and 80 to 89 years) and sex (male and female) after random sampling from the resident registry of the town of Obuse in 2014. A total of 412 people (203 male and 209 female) were enrolled for the measurement and analysis of radiographic parameters of sagittal spinal alignment and physical performance tests. Results: Physical function score values decreased with age, with moderate to strong correlations. Within age subgroups, worsened spinal alignment in standing whole-spinal radiographs indicated diminished physical performance results. The impact of the sagittal vertical axis was especially prominent; for each forward shift of the sagittal vertical axis by 1 standard deviation, 1-leg standing time was shortened by 3.8 seconds. Two-step scores were significantly associated with sagittal vertical axis, global tilt, cervical sagittal vertical axis, and pelvic tilt. Conclusions: Our investigation of the influence of sagittal spinal alignment on physical function in a Japanese elderly cohort revealed significant negative correlations between spinal alignment and physical performance after excluding the influence of age and sex. Posture change in the community-dwelling elderly population is an important sign of physical function impairment. Level of Evidence: Prognostic Level IV. See Instructions for Authors for a complete description of levels of evidence.
