Author(s):

  • Jiyong Kim
  • Minseo Park

Abstract:

The number of people suffering from sleep disorders has risen steadily in recent years, and interest in healthy sleep has grown accordingly. Although many health-care industries and services address sleep, specific and objective evaluation of sleep habits is still lacking. Most sleep scores provided by wearable-based sleep health services are calculated from the sleep-stage ratio alone, which is insufficient for studies that consider multiple dimensions of sleep. In addition, most score-generation techniques rely on weighted expert-evaluation models, whose weights are often chosen from experience rather than objectively. This study therefore proposes an objective daily sleep habit score calculation method that considers various sleep factors, based on user sleep data and gait data collected from wearable devices. A credit rating model, built as a logistic regression model, is adapted to generate sleep habit scores for clearly good and clearly bad sleep, and an ensemble machine learning model is designed to generate scores for the remaining intermediate sleep group. The sleep habit score and evaluation model of this study are expected to be in demand not only in health-care and health-service applications but also in the financial and insurance sectors.
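The credit-rating adaptation described above amounts to mapping a logistic regression's predicted odds of "good sleep" onto a point score. A minimal sketch of the points-to-double-the-odds (PDO) scaling conventionally used in credit scorecards is shown below; the base score, base odds, and PDO values are illustrative assumptions, not values taken from the paper:

```python
import math

def scorecard_score(p_good, base_score=600.0, base_odds=50.0, pdo=20.0):
    """Map a model probability of 'good sleep' to a credit-style score.

    Standard scorecard scaling: score = offset + factor * ln(odds),
    where odds = p_good / (1 - p_good). The score equals base_score
    when the odds equal base_odds, and rises by `pdo` points every
    time the odds of good sleep double.
    """
    factor = pdo / math.log(2)
    offset = base_score - factor * math.log(base_odds)
    odds = p_good / (1.0 - p_good)
    return offset + factor * math.log(odds)

# A night the model rates at 0.9 probability of good sleep habits
# scores higher than one rated at 0.6.
print(scorecard_score(0.9), scorecard_score(0.6))
```

Because the mapping is monotone in the predicted probability, the resulting scores preserve the model's ranking of nights while expressing it on the familiar credit-score scale.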

Documentation:

https://doi.org/10.3390/app13021043
