The present PhD research explores the integration of vision devices and intelligent systems to monitor and enhance human well-being in healthcare and manufacturing contexts, starting from the standards proposed in Industry 4.0 and aiming to follow the principles of the novel Industry 5.0. Depth sensors and deep learning technologies have been exploited to address the critical aspects of human mobility assessment and action segmentation in real, non-simulated scenarios. The Microsoft Azure Kinect, a state-of-the-art depth sensor, has been selected as a key instrument for data collection, and innovative camera calibration methods have been developed to ensure the accuracy and reliability of the gathered data.

Within the realm of healthcare, the research activity addresses the substantial challenges posed by neurodegenerative diseases to the well-being of older individuals. This part of the study focuses on monitoring and assessing the mobility of elderly patients, aiming to support remote diagnosis and improve their quality of life. Traditional mobility tests, administered by healthcare professionals, are essential for evaluating movement skills. Nevertheless, such techniques often suffer from human subjectivity, which could lead to errors in the assessments. To address such issues, video-based systems have been studied, aiming to remotely monitor and objectively evaluate mobility, reducing the burden on elderly patients.
In manufacturing, human actions are pivotal to operational efficiency, productivity, and safety. Such challenges have led to the increasing use of industrial robotic solutions, mainly including collaborative robots, which can share a common workspace with humans while carrying out their respective tasks simultaneously. This part of the research delves into the segmentation of human tasks for intelligent manufacturing systems, exploring the integration of vision devices and deep learning technologies to improve the efficiency and accuracy of manufacturing processes. In general, the study of such systems is aimed at creating comfortable work environments, adaptable to the needs and abilities of individual people, increasing the well-being of operators in a human-centered factory concept.

The main goal of the present study is to evaluate the effectiveness of machine learning and deep learning models for mobility assessment and action segmentation, in order to determine their suitability for human monitoring. However, a notable gap in the literature is identified: the absence of datasets representing human actions in realistic environments. To bridge this gap, the research includes the creation and validation of datasets capturing human actions in healthcare and manufacturing scenarios, emphasizing the importance of generalization across different locations. By addressing the unique challenges in both healthcare and manufacturing, this study contributes to the development of intelligent systems that promote human well-being and enhance operational efficiency, aiming to align with the paradigms of Industry 5.0.
Vision devices and intelligent systems for monitoring the well-being of humans in healthcare and manufacturing / Romeo, Laura. - ELETTRONICO. - (2024).
Vision devices and intelligent systems for monitoring the well-being of humans in healthcare and manufacturing
Romeo, Laura
2024-01-01
Abstract
File: 36 ciclo-ROMEO Laura.pdf
Access: open access
Description: Doctoral thesis
Type: Doctoral thesis
License: not specified
Size: 17.16 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.