Real-Time Machine Vision System for Quality Control in Packaging Lines
Authorship
M.K.M.H.
University Master in Computer Vision
Defense date
02.03.2025 12:00
Summary
This paper presents a real-time machine vision system deployed on a packaging line at Financiera Maderera S.A., a leading Spanish manufacturer of engineered wood products. The system tackles the challenge of detecting open boxes in high-speed production. It features a custom image acquisition setup and image processing to identify defects. Integrated seamlessly into the packaging line, the system achieves 99.89% overall accuracy, with 96.31% precision and 96.56% recall for detecting open boxes, all in less than 0.25 seconds per box. These results demonstrate the potential of the system to improve quality control in high-speed packaging line applications across various sectors.
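For reference, the reported metrics follow from raw inspection counts in the standard way; a minimal sketch with hypothetical counts (not the project's data):

```python
# Minimal sketch relating the reported metrics to raw detection counts.
# The counts below are hypothetical and only illustrate the definitions.

def detection_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Precision, recall and accuracy from a binary confusion matrix."""
    precision = tp / (tp + fp)   # of boxes flagged as open, fraction truly open
    recall = tp / (tp + fn)      # of truly open boxes, fraction flagged
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return {"precision": precision, "recall": recall, "accuracy": accuracy}

# Hypothetical example over 10,000 inspected boxes:
print(detection_metrics(tp=261, fp=10, fn=9, tn=9720))
```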
Direction
BREA SANCHEZ, VICTOR MANUEL (Tutorships)
Ferreiro Miranda, José Miguel (Co-tutorships)
Court
Cernadas García, Eva (Chairman)
Pardo López, Xosé Manuel (Secretary)
FLORES GONZALEZ, JULIAN CARLOS (Member)
Research and development of revenue prediction models in the hospitality sector
Authorship
C.B.L.
Master in artificial intelligence
Defense date
02.20.2025 11:00
Summary
The hotel industry is a dynamic environment where constant adjustments to room prices are necessary to optimize each hotel’s revenue. In this context, the ability to accurately forecast future revenue is key for strategic decision-making and price optimization. Revenue forecasting refers to predicting the earnings a hotel will generate on a given day, considering all information available up to the moment the prediction is made, regardless of how far in advance the forecast is made. In this project, revenue prediction models are developed for Eurostars Hotel Company using Machine Learning techniques, addressing challenges related to data quality, computational costs, and the need for interpretability. Using Data Science techniques, the limitations of the dataset are analyzed, and issues such as high dimensionality, dataset size, and the presence of outliers are addressed. The effectiveness of generalized models covering multiple hotels is explored to reduce the total number of models and enable revenue forecasting for hotels with insufficient historical data. Additionally, a separation is introduced in the prediction based on lead time (short-term and long-term), allowing for improved accuracy during critical sales periods. Finally, the results of these implementations are evaluated both individually and in comparison with the previous models in use, drawing conclusions about their effectiveness and utility for the company.
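The lead-time separation described above can be pictured as routing each prediction to a regime-specific model; a minimal sketch, assuming a 30-day threshold, column names, and a model choice that are illustrative only, not the project's actual pipeline:

```python
# Minimal sketch of a short-/long-term split by lead time. The threshold,
# column names and model choice are assumptions for illustration.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

THRESHOLD_DAYS = 30  # assumed cut-off between short- and long-term

def fit_split_models(df: pd.DataFrame, features: list[str]):
    """Train one regressor per lead-time regime instead of a single model."""
    short = df[df["lead_time_days"] <= THRESHOLD_DAYS]
    long_ = df[df["lead_time_days"] > THRESHOLD_DAYS]
    m_short = GradientBoostingRegressor().fit(short[features], short["revenue"])
    m_long = GradientBoostingRegressor().fit(long_[features], long_["revenue"])
    return m_short, m_long

def predict_revenue(m_short, m_long, row: pd.Series, features: list[str]) -> float:
    """Route a single prediction to the model matching its lead time."""
    model = m_short if row["lead_time_days"] <= THRESHOLD_DAYS else m_long
    return float(model.predict(row[features].to_frame().T)[0])
```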
Direction
LAMA PENIN, MANUEL (Tutorships)
Comesaña García, Alejandra (Co-tutorships)
Freire Martínez, Ignacio (Co-tutorships)
Court
TABOADA IGLESIAS, MARÍA JESÚS (Chairman)
Pardo López, Xosé Manuel (Secretary)
VILA BLANCO, NICOLAS (Member)
Kinetic modelling of fermentation for the valorisation of waste into carboxylates via lactic acid production
Authorship
S.G.G.
Master in Environmental Engineering (3rd ed)
Defense date
02.20.2025 11:30
Summary
The circular economy proposes a paradigm shift in waste management, promoting its conversion into reusable and recoverable resources. In this context, the growing generation of organic waste, such as food waste, is driving the development of biorefineries, which transform this waste into products such as biofuels and chemical compounds with high added value. One agri-food waste with high valorisation potential is cheese whey, a very abundant by-product generated in cheese production. Anaerobic fermentation is positioned as a key process in waste valorisation. Through the use of mixed cultures, which offer metabolic flexibility, robustness to environmental fluctuations and the ability to process non-sterile and complex substrates, it allows the conversion of a wide variety of wastes into volatile fatty acids (VFA), which have numerous applications as precursors for biopolymers, biofuels and chemicals. VFA can also be transformed into medium-chain fatty acids (MCFA), such as caproate, which have economic advantages over VFA. MCFA are produced by a chain elongation process, which requires electron donors such as ethanol or lactate. A key advantage of this process is the possibility to substitute external donors with lactate, produced in open fermentation processes from lactose-rich organic residues such as cheese whey, thus optimising the production of MCFA. The use of mixed cultures also presents certain challenges, such as competition between different microbial species. Moreover, with the use of waste as substrates, these challenges increase due to their complex composition. Mathematical models are essential tools for the preliminary design of biotechnological processes, as they allow the simulation and prediction of microbial behaviour under specific conditions, such as pH, hydraulic residence time (HRT) and substrate concentration. This predictive capability facilitates process design, allowing adjustment of operating conditions without the need for costly and time-consuming experiments. Although models for anaerobic fermentation processes exist, they do not integrate chain elongation, which limits their application by not fully describing the conversion of complex substrates to MCFA. In this work, an anaerobic fermentation kinetic model was developed that integrates lactate-mediated chain elongation to produce MCFA from carbohydrates and lactate present in cheese whey. A metabolic network based on model microorganisms was constructed that includes the main processes related to lactate production and consumption. The model was calibrated by estimating key kinetic parameters using experimental data and was used for the design of in silico experiments.
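Kinetic models of this kind typically build each microbial uptake process from Monod-type kinetics; as an illustration of the general form only (not the thesis's calibrated equations):

```latex
% Generic Monod-type uptake rate of substrate S by biomass X, the usual
% building block of anaerobic fermentation models (illustrative form):
\rho = k_m \,\frac{S}{K_S + S}\, X
% k_m: maximum specific uptake rate; K_S: half-saturation constant.
% pH- or product-inhibition factors, when needed, multiply \rho.
```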
Direction
MAURICIO IGLESIAS, MIGUEL (Tutorships)
REGUEIRA LOPEZ, ALBERTE (Co-tutorships)
Catenacci, Arianna (Co-tutorships)
Court
Rojo Alboreca, Alberto (Chairman)
GONZALEZ GARCIA, SARA (Secretary)
FERNANDEZ ESCRIBANO, JOSE ANGEL (Member)
Evaluating the removal efficiency of wastewater treatment plants for contaminants of emerging concern
Authorship
I.R.R.
Master in Environmental Engineering (3rd ed)
Defense date
02.20.2025 17:00
Summary
The European Union published its first Directive on wastewater treatment in 1991. Thirty-three years later, the Directive has been revised, imposing stricter treatment requirements and explicitly including the need to eliminate micropollutants, which include contaminants of emerging concern. For this reason, the new Wastewater Directive includes a list of twelve indicator compounds whose removal efficiency could be related to the elimination of many other emerging contaminants. Regarding the general state of water bodies, the Water Framework Directive applies, which already includes a series of chemical compounds that must be systematically monitored to ensure water quality. Additionally, in recent years the European Union has published four watch lists, which include various substances requiring control and monitoring in water bodies to determine their effects on environmental, animal, and human health. Considering that the main entry of contaminants into the water cycle is through discharges from wastewater treatment plants (WWTPs), as well as industrial or accidental discharges, it is crucial to analyze the removal efficiency of the twelve compounds included in the new Wastewater Directive and of other contaminants listed in the watch lists under current WWTP treatments. This constitutes the general objective of this study. Two WWTPs and the receiving water bodies connected to their discharges were studied. As an approach to an additional quaternary treatment, the performance of a pilot ozonation system for the removal of these same contaminants was evaluated. An analytical method based on solid-phase extraction and high-performance liquid chromatography coupled with mass spectrometry was developed for the quantification of the compounds. The method proved to be effective for most of the target compounds. The results obtained from samples of the receiving water bodies showed that the effluent discharge does not increase the concentrations of contaminants in the environment, mainly due to the dilution effect. The removal efficiency of the measured compounds was calculated, showing significant variability depending on the compound analyzed but similar results across the different WWTPs. The removal percentages do not meet the criteria established by the new legislation, highlighting the need to implement additional treatments focused on the elimination of emerging contaminants. On the other hand, ozonation assays demonstrate that this technology is effective in eliminating many contaminants, achieving efficiencies above 90% for most of the measured compounds.
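The removal efficiency referred to throughout is the standard influent-to-effluent definition:

```latex
% Removal efficiency of a compound across a WWTP, from influent and
% effluent concentrations (standard definition):
E\,(\%) = \frac{C_{\mathrm{influent}} - C_{\mathrm{effluent}}}{C_{\mathrm{influent}}} \times 100
```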
Direction
RODIL RODRIGUEZ, MARIA DEL ROSARIO (Tutorships)
MONTES GOYANES, ROSA MARIA (Co-tutorships)
Court
CARBALLA ARCOS, MARTA (Chairman)
VAL DEL RIO, MARIA ANGELES (Secretary)
PARADELO NUÑEZ, REMIGIO (Member)
Improving the nutritional quality of forages for cattle through fertilization with Se, I and B: an approach to improve breeding and prevent diseases
Authorship
V.S.V.
Master in Environmental Engineering (3rd ed)
Defense date
02.20.2025 16:00
Summary
This study explores the impact of fertilization with selenium, boron, and iodine on the quality and productivity of ryegrass (Lolium sp.), a key forage crop in Galicia. Given the importance of pastures in the region’s livestock systems, this research proposes an innovative fertilizer formulation that includes these micronutrients to enhance both the nutritional value of forage and livestock health, ultimately improving the quality of derived products such as milk. The experiment was conducted in pots under ambient conditions in Vilanova de Arousa, using four treatments: the experimental fertilizer combined with another nitrogen-based formulation (SeBI), a commercial fertilizer that includes selenium, nitrogen, and sulfur (Y), a nitrogen-only fertilizer (BA), and a control without fertilization (BB). Parameters such as growth, dry weight, nutrient concentration in the plant, and soil properties before and after the trial were analyzed. The results indicate that the SeBI and Y treatments improved ryegrass growth, increasing biomass and nutritional content. However, productivity was lower than expected (4000 kg DM/ha), possibly due to root restriction in pots and nutrient leaching. The iodine analysis revealed higher accumulation under SeBI, confirming the positive effect of fertilization with this element; selenium and boron accumulation, however, could not be verified. This research underscores the role of targeted fertilization in maximizing forage production, strengthening livestock systems, and potentially enriching the food chain with more nutritious products.
Direction
MONTERROSO MARTINEZ, MARIA DEL CARMEN (Tutorships)
Gago Otero, Rafael (Co-tutorships)
Court
HOSPIDO QUINTANA, ALMUDENA (Chairman)
PARADELO NUÑEZ, REMIGIO (Secretary)
CARBALLA ARCOS, MARTA (Member)
Valorization of lignocellulosic biomass into medium-chain fatty acids through anaerobic fermentation in mixed culture.
Authorship
A.R.G.
Master in Environmental Engineering (3rd ed)
Defense date
02.20.2025 11:00
Summary
The growing concern for the environment and the depletion of fossil resources has increased interest in the search for processes capable of replacing the use of non-renewable resources and reducing the carbon footprint. The circular economy is key to reducing the amount of waste generated and producing high-value-added compounds, and organic waste is a crucial raw material in this endeavor. In this context, lignocellulosic biomass stands out compared to other raw materials due to its vast abundance. Lignocellulosic biomass generally comes from agricultural and forestry residues and is composed of cellulose, hemicellulose, and lignin, with the content of each fraction varying significantly depending on its origin. The composition of these fractions determines the properties of lignocellulosic biomass as well as its potential applications. Hemicellulose is the most biodegradable fraction, followed by cellulose, while lignin hinders most applications. The most established process today is anaerobic digestion, but biogas is a lower-value product compared to volatile fatty acids (VFAs) or medium-chain fatty acids (MCFAs), which are highly demanded in industries such as pharmaceuticals and cosmetics. The process to obtain VFAs is anaerobic fermentation, which follows the steps of anaerobic digestion while avoiding acetogenesis and methanogenesis to obtain VFAs. From these, through a chain elongation process and the presence of electron donors, MCFAs (primarily caproic acid) are obtained. The general objective of this master's thesis is to deepen the understanding of the valorization of lignocellulosic biomass for the production of MCFAs. To achieve this, the production of caproic acid is evaluated using a synthetic mixture of cellulose and xylan, as well as corn stover. To reach this objective, a fed-batch sequencing batch reactor (SBR) is operated with cycles of 2 and 3.5 days and two feedings per cycle. The results of this thesis show that a sequential reactor with two substrate feedings is suitable for MCFA formation. The absence of BES addition and the control of the solid retention time (SRT) through reactor purges resulted in a higher production of caproic acid compared to the opposite conditions. In the experiments conducted with corn residue, a limited substrate conversion was observed due to low hydrolysis, and it was shown that the presence of VFAs limits mold growth in the reactor. For future research, it would be interesting to conduct experiments with synthetic substrates to study other parameters, such as the volatile suspended solids concentration (VSS) or the relationship between substrate loadings. Regarding experiments with real corn stover residue, implementing a pretreatment to improve substrate conversion would be a promising approach.
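Controlling the solids retention time through purges, as mentioned above, follows the usual steady-state relation; a sketch assuming purged solids at the reactor concentration:

```latex
% SRT set by periodic purges of mixed liquor at reactor concentration X
% from a reactor of volume V (illustrative steady-state form):
\mathrm{SRT} = \frac{V\,X}{Q_{\mathrm{purge}}\,X} = \frac{V}{Q_{\mathrm{purge}}}
```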
Direction
CARBALLA ARCOS, MARTA (Tutorships)
Iglesias Riobó, Juan (Co-tutorships)
Court
Rojo Alboreca, Alberto (Chairman)
GONZALEZ GARCIA, SARA (Secretary)
FERNANDEZ ESCRIBANO, JOSE ANGEL (Member)
Assessing farm sustainability from a Planetary Boundaries perspective: A global extension of Life Cycle Assessment.
Authorship
M.G.P.
Master in Environmental Engineering (3rd ed)
Defense date
02.21.2025 09:30
Summary
The search for more appropriate production models with lower environmental burdens in the livestock sector is considered one of the key axes to promote sustainability and the bioeconomy in this sector. Given the current controversy about the environmental impacts of this productive sector, a detailed analysis is needed to evaluate in depth the environmental loads, the necessary improvements, and even the technologies used in production systems. It is also necessary to consider the difference between extensive and intensive production systems, with the aim of identifying which model is more appropriate in terms of impacts, efficiency, and technologies. Given this context, the objective of this Master's Thesis is to assess the environmental effects, at a planetary level, that dairy farms entail through milk and meat production. To this end, primary data from 100 dairy farms in Galicia, of different sizes and with diverse technologies and production capacities, were considered, to which the Life Cycle Assessment methodology was applied under the perspective of planetary boundaries. Specifically, three boundaries were assessed: climate change, ocean acidification, and ozone depletion. The results showed that although the individual contributions of farms are insignificant compared to the planetary boundaries, their accumulation on a global scale has a significant impact. In addition, it was shown that the individual contributions of farms depend on several factors, such as size, age, and the technology used. Therefore, a single conclusion cannot be reached; the environmental assessment and the burden on the planetary boundaries must be evaluated individually for each farm, although certain conclusions can be drawn once the 100 farms are grouped according to technologies, production capacities, and operating time. Regarding the products obtained (milk and meat), milk is, without a doubt, the main environmental contributor, representing 94% of the economic allocation of the impact.
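The 94% figure corresponds to the standard economic allocation between co-products; illustrative form, with p the price and m the amount of each product (generic symbols, not the thesis's notation):

```latex
% Economic allocation factor for milk between the milk and meat
% co-products (standard LCA practice):
f_{\mathrm{milk}} = \frac{p_{\mathrm{milk}}\, m_{\mathrm{milk}}}
{p_{\mathrm{milk}}\, m_{\mathrm{milk}} + p_{\mathrm{meat}}\, m_{\mathrm{meat}}} \approx 0.94
```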
Direction
FEIJOO COSTA, GUMERSINDO (Tutorships)
Arias Calvo, Ana (Co-tutorships)
Court
PRIETO LAMAS, BEATRIZ LORETO (Chairman)
RIOS VIQUEIRA, JOSE RAMON (Secretary)
ROMERO CASTRO, NOELIA MARIA (Member)
Calculation of the water footprint of a banking entity with international activity
Authorship
A.B.G.
Master in Environmental Engineering (3rd ed)
Defense date
02.21.2025 10:45
Summary
Access to and sustainable management of water is currently a global challenge due to population growth, urbanization, and climate change, which increase demand for the resource and affect its availability. In this context, the use of environmental assessment tools, such as life cycle assessment and impact calculation methodologies, allows for the quantification of the water usage impact across different sectors, including the financial sector. This study analyzes the water footprint of a banking entity with an international presence, applying the ISO 14046 methodology. Both the direct water footprint (associated with water consumption in offices and branches) and the indirect water footprint (related to energy, fuel, and material consumption) are evaluated. Databases such as Ecoinvent and environmental assessment methodologies such as AWARE, ReCiPe, and USEtox are used. The results show that the water footprint is distributed unevenly across regions and between direct and indirect impacts, with the latter predominating due to energy consumption. Countries such as Peru and Turkey show the highest water footprint values, while Uruguay shows the lowest, due to its lower water stress and higher use of renewable energy. The study helps identify critical points from an environmental perspective, opening the possibility of focusing efforts on the areas with the greatest impact. Additionally, simulated scenarios, such as the influence of employee mobility or the use of renewable energy, are discussed, as they could have a significant impact in future studies.
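Under ISO 14046 with AWARE characterization factors, the water scarcity footprint aggregates regionalized consumption; a generic form, assumed here for illustration:

```latex
% Water scarcity footprint: consumed volumes weighted by the AWARE
% characterization factor of each region i (generic aggregation):
\mathrm{WF} = \sum_i V_{\mathrm{consumed},\,i} \times \mathrm{CF}_{\mathrm{AWARE},\,i}
```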
Direction
FEIJOO COSTA, GUMERSINDO (Tutorships)
Court
PRIETO LAMAS, BEATRIZ LORETO (Chairman)
Triñanes Fernández, Joaquín Ángel (Secretary)
ROMERO CASTRO, NOELIA MARIA (Member)
Techno-economic analysis of enzymatic PET recycling using immobilized enzyme
Authorship
L.G.G.
Master in Chemical and Bioprocess Engineering
Defense date
02.19.2025 11:15
Summary
Plastics, especially polyethylene terephthalate (PET), have transformed our society and enabled a large number of advances, although their massive production has caused serious environmental problems. Currently, only a small portion of the plastics produced is correctly recycled, making it necessary to develop solutions that improve the management of this waste. This study performs a techno-economic analysis of the production and application of PET-degrading enzymes, expressed both in encapsulated form using nanospheres and in free form. Specifically, the conceptual design of two industrial plants is proposed, including the production stage of the E. coli strain in which the enzyme is expressed, the separation and purification of the target enzyme, the degradation of the residual plastic and, finally, the purification of the reaction products. After analyzing the costs and economic indicators of the baseline cases, different scenarios are studied in which key parameters are modified, such as the yields of microorganism and enzyme production and the percentage of plastic degradation. The results indicate that, although economic viability is not achieved in any of the proposed cases, the limiting step of the process is the degradation of the polymer. Once this critical point is identified, the study concludes by proposing different improvement options to achieve economic viability through process optimization and, consequently, turn this technology into a competitive option in the PET recycling market.
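Techno-economic analyses of this kind usually judge viability through discounted indicators; as a generic reminder of the form (the thesis's actual indicators and values are not reproduced here):

```latex
% Net present value over a plant lifetime of n years, with initial
% investment I_0, yearly cash flows CF_t and discount rate r:
\mathrm{NPV} = -I_0 + \sum_{t=1}^{n} \frac{CF_t}{(1+r)^t}
% Viability requires NPV > 0 at the chosen discount rate.
```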
Direction
EIBES GONZALEZ, GEMMA MARIA (Tutorships)
LU CHAU, THELMO ALEJANDRO (Co-tutorships)
Court
HOSPIDO QUINTANA, ALMUDENA (Chairman)
FRANCO RUIZ, DANIEL JOSE (Secretary)
GONZALEZ GARCIA, SARA (Member)
Canola oil oleogels obtained by indirect method using hybrid carrageenan as a structuring agent
Authorship
C.I.G.
Master in Chemical and Bioprocess Engineering
Defense date
02.19.2025 10:15
Summary
In recent decades, humans have been making a considerable effort to improve their health and living conditions. One way to achieve this relates to eating habits: replacing saturated fats with unsaturated fats reduces the risk of heart disease and excess weight. However, saturated fats have a series of mechanical and organoleptic characteristics that oils lack. Therefore, this work focuses on the investigation of oil structuring techniques that achieve a final product with the best of both worlds. The preparation and study of canola oil emulsion gels has been carried out using carrageenan as a structuring agent. This process involves the application of the emulsion template method, due to the difficulty of dispersing the oil in the aqueous phase that contains the gelling agent. Carrageenan was extracted from the alga Chondrus crispus using a hot bath immersion technique, and its molecular weight was determined by capillary viscometry. The determination of mechanical properties under different conditions is essential. For this reason, rheological studies have been carried out on the emulsions and gels that allow these properties to be related to the variation in the concentration of carrageenan and salt and in the oil-to-water ratio, as well as determining the thermal reversibility and hysteresis phenomena of the gels. Furthermore, taking into account the potential applications in the food and pharmaceutical industries, oil and water retention, texture, and colour tests have been carried out as part of a quality control process, and the results have shown a relationship with the rheology. The results obtained are promising and suggest future challenges towards the use of these gels at an industrial level in the short term.
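Capillary viscometry yields molecular weight through the Mark-Houwink relation, where K and a are tabulated constants for a given polymer, solvent, and temperature:

```latex
% Mark-Houwink relation between intrinsic viscosity and the
% viscosity-average molar mass M_v:
[\eta] = K \, M_v^{\,a}
```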
Direction
SINEIRO TORRES, JORGE (Tutorships)
MOREIRA MARTINEZ, RAMON FELIPE (Co-tutorships)
Court
GONZALEZ ALVAREZ, JULIA (Coordinator)
GARRIDO FERNANDEZ, JUAN MANUEL (Chairman)
GONZALEZ GARCIA, SARA (Secretary)
Techno-economic and environmental analysis of a biorefinery focused on the valorization of olive stones
Authorship
S.C.A.
Master in Chemical and Bioprocess Engineering
Defense date
02.19.2025 09:45
Summary
Climate change represents a crucial challenge for agriculture, affecting food availability due to climate variability and alterations in rainfall patterns. The olive is one of the main crops in Spain, the world's largest producer of both table olives and olive oil; in 2023, around 767,000 tons of olive oil were produced in the country. The olive oil production process generates waste streams, such as olive stones, branches, and leaves, which together account for up to 10 per cent of the weight of the olive. This study focuses on the search for a new life for this waste in accordance with the principles of the circular economy. To this end, a biorefinery for the valorization of this waste is designed based on experimental data, the process is simulated, its environmental profile is identified and, finally, a techno-economic analysis of the system is carried out.
Direction
GONZALEZ GARCIA, SARA (Tutorships)
Court
GARRIDO FERNANDEZ, JUAN MANUEL (Chairman)
MAURICIO IGLESIAS, MIGUEL (Secretary)
GONZALEZ ALVAREZ, JULIA (Member)
Assessment of the quality of industrial wastewater treated by reverse osmosis and its reuse feasibility
Authorship
A.L.D.L.T.
Master in Chemical and Bioprocess Engineering
Defense date
02.19.2025 11:45
Summary
Industrial wastewater poses a significant environmental challenge. While each industry generates its own specific wastes, all industries, regardless of sector, produce wastewater. Industrial discharges are regulated by laws that set maximum limits for various pollutants, along with acceptable ranges for parameters such as pH, conductivity, and temperature. However, even when these regulations are met, the large volumes of discharge can still raise environmental concerns. Additionally, water scarcity, exacerbated by climate change, further affects communities and ecosystems. Consequently, it is crucial to implement policies that minimize the use of natural resources to ensure long-term sustainability. The Pontevedra pulp mill operates its own water treatment plant, which processes the wastewater generated during production before discharging it into the Pontevedra estuary. To address these issues, the plant has recently upgraded its tertiary treatment system by incorporating a reverse osmosis stage. This enhancement allows for the recirculation of a portion of the treated water, significantly reducing both the intake of fresh water and the volume of wastewater discharged. The aim of this project is to analyze the improvement in water quality achieved through the reverse osmosis system and to determine whether the treated water resembles the fresh water used in the production process. Additionally, given the high energy demand of this system, its energy costs are evaluated and potential solutions to reduce them are explored.
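A common figure of merit for the energy evaluation mentioned above is the specific energy consumption of the reverse osmosis stage; an illustrative form, not the plant's actual accounting:

```latex
% Specific energy consumption (energy per cubic metre of permeate) of an
% RO stage with feed pressure P, feed flow Q_feed and pump efficiency eta:
\mathrm{SEC} = \frac{P_{\mathrm{feed}}\, Q_{\mathrm{feed}}}{\eta_{\mathrm{pump}}\, Q_{\mathrm{permeate}}}
```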
Direction
MOSQUERA CORRAL, ANUSKA (Tutorships)
Raposo Fernández, Fernanda (Co-tutorships)
Court
HOSPIDO QUINTANA, ALMUDENA (Chairman)
FRANCO RUIZ, DANIEL JOSE (Secretary)
GONZALEZ GARCIA, SARA (Member)
From Synthetic to Real Object Detection with Unsupervised Domain Adaptation
Authorship
P.G.P.
University Master in Computer Vision
Defense date
02.03.2025 11:30
Summary
Object detection has made remarkable progress in recent years, driven by advancements in deep learning and the availability of large-scale annotated datasets. However, these methods often require extensive labeled data, which may not be accessible for specific or emerging applications. This limitation has generated interest in Unsupervised Domain Adaptation (UDA), which facilitates knowledge transfer from a labeled source domain to an unlabeled or differently distributed target domain. This study addresses the challenge of Domain Adaptation between synthetic and real-world data. A novel methodology for generating synthetic datasets has been developed using AirSim and Unreal Engine, enabling the creation of highly customizable and diverse datasets. Two Domain Adaptation techniques were employed: D3T, drawn from the UDA state of the art, and MixPL, adopted from Semi-Supervised Learning. Different enhancements were proposed to maximize the utility of the synthetic dataset and improve domain alignment: in the case of D3T, modifications to the cost function were proposed, while in MixPL a new detector with a transformer backbone was incorporated into the original architecture. The results obtained with D3T and MixPL demonstrate encouraging progress, as they approach the performance of models trained directly on the target domain. These findings suggest that synthetic datasets have significant potential in addressing data scarcity and improving model generalization, while also pointing to promising directions for further exploration in this area.
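Teacher-student UDA methods of this family rest on filtering the teacher's predictions on unlabeled target images into pseudo-labels; a minimal sketch of that step, with the threshold and data structures as illustrative assumptions rather than the thesis's code:

```python
# Minimal sketch of confidence-based pseudo-label filtering, the core step
# of teacher-student UDA detectors such as D3T and MixPL. The threshold
# and structures are illustrative assumptions, not the thesis's code.
from dataclasses import dataclass

@dataclass
class Detection:
    box: tuple[float, float, float, float]  # (x1, y1, x2, y2)
    label: int
    score: float

def make_pseudo_labels(teacher_preds: list[Detection],
                       score_thr: float = 0.8) -> list[Detection]:
    """Keep only confident teacher detections as training targets for the
    student on unlabeled real images."""
    return [d for d in teacher_preds if d.score >= score_thr]

# The student then trains on labeled synthetic batches mixed with real
# batches carrying these pseudo-labels, while the teacher is typically
# updated as an exponential moving average of the student's weights.
```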
Direction
MUCIENTES MOLINA, MANUEL FELIPE (Tutorships)
CORES COSTA, DANIEL (Co-tutorships)
Court
Cernadas García, Eva (Chairman)
Pardo López, Xosé Manuel (Secretary)
FLORES GONZALEZ, JULIAN CARLOS (Member)
Enhancing pixel feature extraction with Quantum Machine Learning
Authorship
D.B.F.P.
University Master in Computer Vision
Defense date
02.03.2025 10:30
Summary
In this work, we analysed the use of Quantum Machine Learning (QML) models to enhance pixel feature extraction. The main objective was to study whether QML models can enhance feature quality and drastically reduce the number of parameters needed to train a model. The image features were extracted using a four-qubit Variational Quantum Circuit (VQC), which was effectively trained using the parameter-shift rule. The QML model outperformed classical deep learning models on small datasets. However, as the dataset size increased, the extracted features became less relevant, leading to diminished performance compared to the classical approaches and demanding further analysis and an increase in the number of VQC qubits and layers. We can conclude that QML arises as a new paradigm to enhance feature extraction performance and drastically reduce the number of trainable parameters.
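The parameter-shift rule evaluates exact gradients of the circuit from two shifted executions; the standard form for Pauli-rotation gates:

```latex
% Parameter-shift rule for a gate angle \theta in a variational circuit:
% the gradient of the measured expectation value comes from two circuit
% evaluations shifted by +/- pi/2 (standard form for Pauli rotations):
\frac{\partial \langle H \rangle}{\partial \theta}
= \frac{\langle H \rangle_{\theta + \pi/2} - \langle H \rangle_{\theta - \pi/2}}{2}
```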
Direction
VILA BLANCO, NICOLAS (Tutorships)
Fernández Pena, Anselmo Tomás (Co-tutorships)
Court
Cernadas García, Eva (Chairman)
Pardo López, Xosé Manuel (Secretary)
FLORES GONZALEZ, JULIAN CARLOS (Member)
Deep Learning-Assisted Segmentation of Intracranial Hemorrhages in Brain CT Scans: nnU-Net, MedNeXt and Ensemble Architectures
Authorship
M.P.F.
University Master in Computer Vision
Defense date
02.03.2025 12:30
Summary
This study compared different deep learning models for Intracranial Hemorrhage (ICH) segmentation in CT scans, including 2D nnU-Net, 3D nnU-Net, and MedNeXt, as well as an ensemble model combining the 2D and 3D nnU-Nets. The models were trained on 200 images with 5-fold cross-validation for 1,000 epochs and tested on unseen data. MedNeXt achieved the highest performance with a Dice score of 0.6433±0.2417, followed by the 2D-3D nnU-Net ensemble at 0.6405±0.2256. The 2D and 3D nnU-Net models scored 0.6348±0.2179 and 0.6331±0.2636, respectively. MedNeXt's superior performance highlights the benefit of its transformer-inspired design for capturing long-range dependencies and global context, though at the cost of longer training times due to the added complexity. The ensemble model showed that combining 2D and 3D U-Net architectures can improve segmentation accuracy without increasing model complexity. Despite 3D U-Net's potential advantages, the 2D and 3D nnU-Net models performed similarly, indicating that either configuration can be effective depending on the use case. These findings make MedNeXt (Dice score 0.6433) the preferred choice for clinical implementation, while the ensemble approach offers a robust alternative with comparable results (Dice score 0.6405) and no architectural modifications, providing flexibility for different clinical deployment scenarios.
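For reference, the Dice score used throughout is 2|A∩B| / (|A| + |B|) over predicted and ground-truth masks. A minimal sketch (not the nnU-Net evaluation code):

```python
# Dice coefficient between a predicted and a ground-truth binary mask.
import numpy as np

def dice(pred, gt, eps=1e-8):
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    # eps keeps the score defined when both masks are empty.
    return (2.0 * inter + eps) / (pred.sum() + gt.sum() + eps)

pred = np.zeros((4, 4)); pred[1:3, 1:3] = 1
gt = np.zeros((4, 4)); gt[1:3, 1:4] = 1
print(dice(pred, gt))  # 2*4 / (4+6) = 0.8
```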
Direction
BREA SANCHEZ, VICTOR MANUEL (Tutorships)
Gómez González, Juan Pablo (Co-tutorships)
Court
Cernadas García, Eva (Chairman)
Pardo López, Xosé Manuel (Secretary)
FLORES GONZALEZ, JULIAN CARLOS (Member)
Sustainability of Enzyme Production from the Perspective of Life Cycle Analysis and Green Chemistry Principles
Authorship
A.C.N.
Master in Environmental Engineering (3rd ed)
Defense date
02.20.2025 16:30
Summary
Many consumers seek sustainable alternatives due to the environmental issues posed by conventional plastics. PEF (polyethylene furanoate), a green polymer similar to PET (polyethylene terephthalate), is emerging as a more eco-friendly solution produced from renewable sources. This study presents the Life Cycle Assessment (LCA) results for the production of HMFO (hydroxymethylfurfural oxidase), an enzyme essential for PEF synthesis, from a cradle-to-gate perspective. Four production scenarios were evaluated, from a laboratory scale of 5 L up to 1 m³, with intermediate scenarios of 25 and 50 L, aiming to analyze environmental impacts at industrial scale. The analysis revealed that glucose, energy, and K2HPO4 are the primary contributors to the environmental impact of the process. Among the stages, fermentation and purification showed the most significant environmental impacts, whereas the pre-inoculum and inoculum stages showed comparatively lower impacts, largely due to their shorter operating times. Scaling production up to 1 m³ led to a marked decrease in overall environmental impacts, reflecting improved resource efficiency and reduced waste generation. Compared with other research in the literature, this process demonstrates a higher degree of sustainability. The application of green chemistry principles was central to the assessment, with sustainability evaluated through 15 distinct metrics; this methodology confirmed that the 1 m³ production scale is substantially more sustainable than laboratory-scale operation. Direct comparisons of environmental impacts with other enzyme-production studies were limited, however, by discrepancies in the functional units and impact categories considered.
Direction
MOREIRA VILAR, MARIA TERESA (Tutorships)
Arias Calvo, Ana (Co-tutorships)
Court
HOSPIDO QUINTANA, ALMUDENA (Chairman)
PARADELO NUÑEZ, REMIGIO (Secretary)
CARBALLA ARCOS, MARTA (Member)
Production of Volatile Fatty Acids taking into account sustainability and circularity dimensions
Authorship
A.S.L.
Master in Environmental Engineering (3rd ed)
Defense date
02.20.2025 12:00
Summary
The increase in water scarcity, climate change, and the intensive exploitation of natural resources have a negative impact on the water cycle. These factors diminish water availability, putting both human communities and the natural ecosystems that depend on this resource at risk. It is therefore essential to implement innovative and sustainable solutions that ensure access to clean water and protect the environment. A key strategy for addressing these issues is wastewater treatment. This process helps reduce pollution by improving the quality of treated water. Moreover, it not only ensures water sustainability by returning water to water bodies but also allows for the recovery of valuable resources such as nutrients and energy. Within the European Union, the European Taxonomy has been established as a legislative framework designed to promote sustainable economic activities in alignment with six fundamental environmental objectives: climate change mitigation, climate change adaptation, the sustainable use and protection of aquatic and marine resources, the transition to a circular economy, pollution prevention and control, and the protection and restoration of biodiversity and ecosystems. Under this framework, activities related to wastewater treatment and biowaste management stand out for their ability to contribute to climate change mitigation and foster a circular economy. This Master's Final Project focuses on the analysis of three specific case studies: the Os Praceres Wastewater Treatment Plant, the Lagares Wastewater Treatment Plant, and a pilot plant dedicated to producing volatile fatty acids from sludge and biowaste. In each case, the European Taxonomy methodology was applied to evaluate which activities align with the sustainability objectives defined by this regulatory framework. Additionally, the indicators from the BioReCer project were used to assess the development of these facilities in terms of circularity and sustainability.
Direction
MOREIRA VILAR, MARIA TERESA (Tutorships)
Court
FERNANDEZ ESCRIBANO, JOSE ANGEL (Chairman)
MAURICIO IGLESIAS, MIGUEL (Secretary)
GONZALEZ GARCIA, SARA (Member)
Evaluation of the impact of an activator in technosols versus the conventional process
Authorship
X.R.B.
Master in Environmental Engineering (3rd ed)
Defense date
02.21.2025 10:15
Summary
The Mina de Touro is a metal sulfide deposit that was exploited from 1974 to 1988 to obtain copper, causing major environmental disturbance. The materials left exposed to the open air, devoid of vegetation or soil, generate large amounts of hyperacid drainage with high concentrations of heavy metals. To address this environmental problem, the mine is currently being restored through the application of reducing and buffering technosols. This work aims to optimize the production process of these technosols by studying the viability of using an accelerator of the composting process, and to analyze the advantages this additive brings over the conventional process. To determine whether an accelerator product is suitable, the study addresses the following points: composting times, composting temperatures, stabilization of organic matter, microbiological differences, odor elimination, and economic impact. In addition, tests are carried out on the doses and forms of application of the accelerator in question.
Direction
GONZALEZ GARCIA, SARA (Tutorships)
Castro Fernández, Noelia (Co-tutorships)
Court
PRIETO LAMAS, BEATRIZ LORETO (Chairman)
RIOS VIQUEIRA, JOSE RAMON (Secretary)
ROMERO CASTRO, NOELIA MARIA (Member)
2D vision-based control of wood timber packaging processes
Authorship
A.R.C.
University Master in Computer Vision
Defense date
02.03.2025 13:00
Summary
Within the wood industry, the automation of processing lines is becoming increasingly essential to improve efficiency and reduce manual labour. This project specifically addresses the automation of a dry timber packaging line by integrating two ABB robotic arms for handling. However, controlling all process variables in a real industrial environment proved challenging. In order to address this, a vision-based control system was implemented, using a 2D matrix camera to verify the position and condition of each layer of planks before being handled by the robots. This vision system is crucial for ensuring the accuracy and reliability of the packaging, as it detects any misalignment or defects that could compromise the operation. By integrating computer vision, the system not only enhances the overall efficiency but also significantly improves the precision and safety of the process. This paper presents the design, implementation, and impact of the vision system within the broader context of industrial automation.
Direction
BREA SANCHEZ, VICTOR MANUEL (Tutorships)
Ferreiro Miranda, José Miguel (Co-tutorships)
Court
Cernadas García, Eva (Chairman)
Pardo López, Xosé Manuel (Secretary)
FLORES GONZALEZ, JULIAN CARLOS (Member)
Microstructural study of olive oil oleogels with chitosan and chitin from Pickering emulsions
Authorship
S.V.M.
Master in Chemical and Bioprocess Engineering
Defense date
02.19.2025 09:15
Summary
One of the main objectives of the current food industry is to replace fats that are harmful to health with oils or fats that have similar organoleptic properties and a good nutritional profile. This master's thesis studies the development of oleogels from vegetable oils, specifically olive oil. The emulsion is created from an aqueous phase of chitosan as a structuring agent and vanillin which, on reacting with chitosan, forms a Schiff base and creates a three-dimensional network that traps the oil. Additionally, chitin is incorporated in the form of particles to stabilize the emulsion through the Pickering effect. Subsequently, water is removed from the emulsions by convective drying to obtain a dry emulsion which, after crushing, yields the final oleogel. The main objective of this work is to study the impact of chitin particle concentration (0.5, 1.0, 1.5, and 2.0 percent) on the properties of the emulsion and the oleogel, both in the presence and absence of vanillin. The rheological properties of the emulsion were studied through oscillatory tests: first, strain sweeps were conducted to determine the linear viscoelastic regime of the emulsions, and then the mechanical spectra were measured to observe the variation of the elastic and viscous moduli with frequency. For the oleogel, rheological properties were analyzed through strain and frequency sweeps, along with quality tests such as colorimetry, texture, and oil retention. The rheological tests showed that the emulsions exhibited predominantly viscous behavior at chitin concentrations below 1.5 percent by weight, with the behavior inverting at higher concentrations. The rheological behavior of the emulsions was modeled with the Cross-Williamson model for systems without chitin; for systems with chitin, an empirical term based on particle concentration was added, yielding a complex viscosity model valid over the tested concentration range. The droplet size distribution of the emulsions was also analyzed using optical microscopy images, showing that droplet diameter decreased with increasing chitin concentration. Regarding the oleogels, convective drying at 70 degrees Celsius was modeled with the Page model, verifying that drying time increased with chitin concentration. Furthermore, chitin particles were observed to destabilize the oleogels, resulting in lower oil retention in systems with chitin. Although vanillin had no significant effect on the rheology of the emulsions, its presence in the final oleogel considerably improved the measured properties, attributed to the endothermic reactions between amino and aldehyde groups, which are favored by the drying temperature. In general, as the amount of chitin increases, the hardness of the oleogel decreases while its adhesiveness and elasticity increase, reaching a maximum at a chitin concentration of 1.5 percent; cohesiveness remains almost unchanged. In conclusion, the presence of chitin particles does not improve the quality of the oleogels compared to systems without particles, but it does increase the stability of the emulsions, which is relevant at industrial scale.
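For reference, the two models named above have these standard forms (the thesis-specific empirical chitin term is not reproduced): the Cross-Williamson complex-viscosity model and the Page thin-layer drying model,

\[
\eta^{*}(\omega) = \frac{\eta_{0}}{1 + (\lambda\,\omega)^{m}}, \qquad
MR(t) = \frac{X(t) - X_{e}}{X_{0} - X_{e}} = e^{-k\,t^{\,n}},
\]

where \(\eta_{0}\) is the zero-shear viscosity, \(\lambda\) a characteristic time, \(m\) a flow index, \(X\) the moisture content (subscripts 0 and e for initial and equilibrium values), and \(k\), \(n\) the fitted Page parameters.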
Direction
MOREIRA MARTINEZ, RAMON FELIPE (Tutorships)
SINEIRO TORRES, JORGE (Co-tutorships)
Court
GARRIDO FERNANDEZ, JUAN MANUEL (Chairman)
MAURICIO IGLESIAS, MIGUEL (Secretary)
GONZALEZ ALVAREZ, JULIA (Member)
Image Recognition on the Edge with Green Algorithms
Authorship
L.C.
University Master in Computer Vision
Defense date
02.25.2025 10:00
Summary
This work focuses on optimizing object detection for edge devices using lightweight deep learning models. We address the challenges of large model size and computational overhead by applying techniques such as knowledge distillation, pruning and quantization. We present an enhanced version of YOLOv11 that uses RGB and thermal images for detection, improving efficiency and accuracy for low-power edge devices.
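Two of the compression techniques named above, sketched with standard PyTorch utilities on a stand-in network; the actual YOLOv11 pipeline from the thesis is not reproduced, and the layer sizes and pruning ratio are illustrative.

```python
# Magnitude pruning and post-training dynamic quantization; a hedged sketch.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(16, 16, 3, padding=1), nn.ReLU())
head = nn.Sequential(nn.Flatten(), nn.Linear(16 * 8 * 8, 10))
model = nn.Sequential(backbone, head)   # expects 3x8x8 inputs

# Unstructured L1 pruning: zero the 30% smallest-magnitude conv weights.
for m in model.modules():
    if isinstance(m, nn.Conv2d):
        prune.l1_unstructured(m, name="weight", amount=0.3)
        prune.remove(m, "weight")       # bake the sparsity into the tensor

# Dynamic quantization stores Linear weights as int8; convolutions need the
# static workflow with calibration data, omitted here for brevity.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8)
print(quantized(torch.randn(1, 3, 8, 8)).shape)  # torch.Size([1, 10])
```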
Direction
BREA SANCHEZ, VICTOR MANUEL (Tutorships)
PARDO SECO, FERNANDO RAFAEL (Co-tutorships)
Court
Cernadas García, Eva (Chairman)
Pardo López, Xosé Manuel (Secretary)
FLORES GONZALEZ, JULIAN CARLOS (Member)
Quality prediction in industry: Detection of blowout defects in panels
Authorship
A.F.G.
Master in Massive Data Analysis Technologies: Big Data
Defense date
02.19.2025 10:00
Summary
Finsa is a company that has been dedicated to the industrial transformation of wood for nearly a century, designing and manufacturing solutions for the habitat and construction sectors. This range of solutions includes decorative materials, technical materials, laminated flooring, furniture and components, as well as solid wood solutions. A key aspect of Finsa’s industrial process is the production of panels, both particleboard and fiberboard (MDF, or medium-density fiberboard). The manufacturing process is highly complex due to the physical and chemical processes involved, as well as the intricate interactions that take place throughout it. Additionally, the diverse product portfolio leads to continuous variations in production, which in turn affects the stability of the process itself. As a result, these panels may, at times, experience quality issues due to process variability, leading to the occurrence of various types of defects, such as blowouts. Ensuring compliance with technical specifications and maintaining high product quality is essential. To achieve this, various measurement devices are installed throughout the production process. Currently, in many of its production lines, Finsa employs a sensor that analyzes the panels at the press exit and detects whether a panel has suffered a blowout, in which case it is marked for disposal. However, this mechanism is not entirely reliable under certain conditions due to various operational challenges, such as contamination, calibration issues, and other factors that may cause it to be non-functional at times. Furthermore, implementing this technology in the factory is both costly and complex. For these reasons, there is a need for a predictive blowout model that functions as a 'virtual sensor' to provide additional support. To develop this model, we have access to hundreds of process signals recorded on a per-second basis. The primary objective of this Master’s Thesis is to assess the feasibility of such predictive models in addressing the blowout issue at the press exit. To this end, several approaches using different types of models will be explored. The performance metrics of each model will be computed, and the results compared to identify the model that offers the most effective solution. If the results prove favorable, the next step will be to implement the model in the factory through the development of a dashboard that will allow operators to determine whether a panel has suffered a blowout. This will enable them to take corrective actions in real time, thus minimizing the impact of the defect.
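A baseline "virtual sensor" of the kind described could aggregate the per-second process signals over each pressing cycle and train a classifier to flag blowouts. A hedged sketch with scikit-learn; the file and column names are invented for illustration:

```python
# Per-panel feature aggregation plus a classifier; not Finsa's actual model.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

signals = pd.read_csv("press_signals.csv")   # per-second signals + panel_id
labels = pd.read_csv("panel_labels.csv")     # panel_id, blowout (0/1)

# One row per panel: mean/std/min/max of every signal over its cycle.
features = signals.groupby("panel_id").agg(["mean", "std", "min", "max"])
features.columns = ["_".join(c) for c in features.columns]
data = features.join(labels.set_index("panel_id"))

X, y = data.drop(columns="blowout"), data["blowout"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=300, class_weight="balanced")
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```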
Direction
MERA PEREZ, DAVID (Tutorships)
CONDE LEBORAN, IVAN (Co-tutorships)
Court
VIDAL AGUIAR, JUAN CARLOS (Chairman)
Triñanes Fernández, Joaquín Ángel (Secretary)
AMEIJEIRAS ALONSO, JOSE (Member)
Validation of design and scaling heuristics integrating environmental externalities
Authorship
A.P.C.
Master in Chemical and Bioprocess Engineering
Defense date
02.19.2025 10:45
Summary
Early-stage evaluations are of interest for validating the current reliability of design heuristics in chemical engineering, anticipating the technical, economic, and environmental viability of a project from a holistic perspective. Their development depends on the maturity of a technology, so this document addresses their application to two representative case studies. First, the suitability of the typical design factor relating the operating reflux ratio to its minimum value is tested for distillation, in a double analysis covering a large-scale azeotropic system in a tray column and a packed column typical of production in the fine-chemicals sector. In a second evaluation, the most suitable scale-up methods are reviewed for a hypothetical industrial plant producing polyhydroxyalkanoates with mixed microbial cultures and a residual organic substrate, based on pilot experimental records. The results for the mature technology demonstrated a favourable balance between separation efficiency and operating costs at lower reflux ratios, accepting a higher initial investment to prioritise a reduction in emissions and process costs. For the emerging system, the process engineering scale-up heuristics were found to be optimal in terms of applicability and uncertainty, while the economic and carbon footprint analysis pointed to acidification conversion, intracellular product extraction, and raw material acquisition costs as the main limitations to market introduction.
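The "typical factor" under test is presumably the classic design heuristic that fixes the operating reflux ratio as a multiple of the minimum reflux ratio:

\[
R = f \, R_{\min}, \qquad f \approx 1.1\text{--}1.5,
\]

where smaller values of \(f\) cut reboiler duty (operating cost and emissions) at the expense of more stages and a taller column (capital cost). The range quoted here is the one commonly given in standard design texts, not a result of the thesis.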
Direction
HOSPIDO QUINTANA, ALMUDENA (Tutorships)
MAURICIO IGLESIAS, MIGUEL (Co-tutorships)
Court
GONZALEZ ALVAREZ, JULIA (Chairman)
FRANCO RUIZ, DANIEL JOSE (Secretary)
GONZALEZ GARCIA, SARA (Member)
Impact of the healthy control database on brain PET imaging quantification
Authorship
E.F.Z.
University Master in Computer Vision
Defense date
02.03.2025 11:00
Summary
Positron Emission Tomography (PET) is crucial in neurological research and clinical practice, providing information about brain metabolism, especially through [18F]FDG PET, which aids in the diagnosis of conditions such as epilepsy. While quantitative PET analysis can improve diagnostic precision, its results may be influenced by the selection of the healthy control (HC) dataset. This study uses SimPET-generated synthetic data to explore how HC databases acquired on six different scanners impact quantitative PET analysis, aiming to improve the robustness of clinical quantification methods. To assess the impact of scanner variability, both VOI-based and voxel-based analyses were performed, comparing a synthetic database of epilepsy patients against the different synthetic HC databases. The results show that scanner variability significantly affected the detection of both focal and extended epileptic lesions. These findings highlight the importance of data harmonization to ensure reliable and accurate PET quantification. In conclusion, while the choice of a locally acquired healthy control database is important, its impact on PET quantification could be limited when comparing scanners of the same generation.
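The voxel-based analysis described typically reduces to comparing each patient voxel against the HC distribution. A numpy sketch with synthetic array shapes (the SimPET volumes and the actual pipeline are not reproduced):

```python
# Voxel-wise z-map of a patient volume against a healthy-control database.
import numpy as np

def voxel_z_map(patient, hc_stack, eps=1e-6):
    # hc_stack: (n_controls, x, y, z) spatially normalised PET volumes.
    mu = hc_stack.mean(axis=0)
    sd = hc_stack.std(axis=0, ddof=1)
    return (patient - mu) / (sd + eps)

hc = np.random.rand(20, 8, 8, 8)        # stands in for an HC database
patient = np.random.rand(8, 8, 8)
z = voxel_z_map(patient, hc)
lesion_mask = z < -2.0                  # hypometabolic voxels, e.g. epilepsy
```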
Direction
BREA SANCHEZ, VICTOR MANUEL (Tutorships)
Aguiar Fernández, Pablo (Co-tutorships)
Court
Cernadas García, Eva (Chairman)
Pardo López, Xosé Manuel (Secretary)
FLORES GONZALEZ, JULIAN CARLOS (Member)
Automation and Optimization of Prediction Algorithms for IoT Devices in the Electricity Sector: A Scalable Approach
Authorship
B.G.M.
Master in artificial intelligence
Defense date
02.21.2025 11:00
Summary
The increasing amount of data coming from IoT devices affects all business sectors, and the electricity sector is no exception. The rapid evolution of AI has enhanced the automation and optimization of data processing, enabling more efficient handling of large-scale datasets in predictive modeling. This article presents a tool that efficiently automates time series predictions based on historical electrical measurements, using a set of deep learning algorithms, with their respective validation and subsequent analysis of the results. The results illustrate the effectiveness of the AI algorithms in generating accurate forecasts. By establishing a scalable and automated framework for model training and evaluation, this work contributes to the ongoing advancement of AI technologies in the electricity sector, paving the way for more efficient management of its resources.
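A sliding-window forecasting setup of the kind such a tool automates, sketched with a small LSTM in PyTorch; the window length, architecture, and training budget are illustrative, not the system's configuration:

```python
# One-step-ahead forecasting of an electrical measurement channel; a sketch.
import torch
import torch.nn as nn

def make_windows(series, window=24):
    # 1-D tensor of measurements -> (samples, window, 1) inputs and targets.
    X = torch.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X.unsqueeze(-1), y

class Forecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1]).squeeze(-1)  # predict the next value

series = torch.randn(1000)              # stands in for an IoT meter channel
X, y = make_windows(series)
model = Forecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                      # a few epochs, for illustration only
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()
```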
Direction
CONDORI FERNANDEZ, OLINDA NELLY (Tutorships)
Franco Romero, Carlos (Co-tutorships)
Dacosta Tapia, Jorge (Co-tutorships)
Court
Gamallo Otero, Pablo (Chairman)
CORES COSTA, DANIEL (Secretary)
CHAVES FRAGA, DAVID (Member)
Agent-based Virtual Assistant for Clinical Decision Support Systems
Authorship
A.G.L.
Master in artificial intelligence
Defense date
02.21.2025 11:30
Summary
Clinical Decision Support Systems (CDSSs) play a crucial role in modern healthcare by assisting clinicians in diagnosis, risk assessment, and treatment recommendations. This study presents the development of an Agent-Based Virtual Assistant for CDSS, integrating Large Language Model agents and Retrieval-Augmented Generation to improve decision-making in the management of Pulmonary Embolism (PE). The proposed system is designed to offer four core services: a Standard Response Service for general queries and guideline-based information retrieval, a Patient Data Service for structured patient data access, a Metrics Service for the calculation of PE-related risk scores, and a Recommendation Service for generating personalized clinical recommendations. The system was evaluated through both quantitative and qualitative methodologies, incorporating empirical performance assessments and expert medical validation. Experimental results demonstrated high accuracy and reliability across services, with the Standard Response Service achieving strong semantic alignment (ASS: 89.50\%) and factual consistency (F: 97.50\%). The Metrics Service exhibited 100\% accuracy in classifying PE severity levels. The Recommendation Service, despite the inherent subjectivity in clinical recommendations, showed reasonable alignment with physician indications. Additionally, expert evaluation confirmed that the system’s reasoning process closely mirrors clinical logic, successfully identifying key clinical considerations and enhancing guideline consultation efficiency.
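As an example of what a Metrics Service computes, the sketch below implements the simplified PESI (sPESI) score, a widely used PE risk score, assuming it is among those the system supports:

```python
# Simplified PESI: one point per criterion; 0 points = low risk.
# Assumed to be among the scores the Metrics Service implements.
def spesi(age, cancer, cardiopulmonary_disease, heart_rate,
          systolic_bp, spo2_percent):
    points = sum([
        age > 80,
        bool(cancer),                    # history of cancer
        bool(cardiopulmonary_disease),   # chronic heart or lung disease
        heart_rate >= 110,               # beats per minute
        systolic_bp < 100,               # mmHg
        spo2_percent < 90,               # arterial O2 saturation
    ])
    return points, ("low risk" if points == 0 else "high risk")

print(spesi(72, False, True, 118, 95, 93))  # -> (3, 'high risk')
```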
Direction
MERA PEREZ, DAVID (Tutorships)
Piñeiro Martín, Andrés (Co-tutorships)
López Pérez, María Carmen (Co-tutorships)
Court
Gamallo Otero, Pablo (Chairman)
CORES COSTA, DANIEL (Secretary)
CHAVES FRAGA, DAVID (Member)
Towards Autonomous Web Navigation with LLM-based Agents
Authorship
M.I.A.
Master in artificial intelligence
Defense date
02.21.2025 12:00
Summary
Autonomous web navigation remains a complex challenge, particularly due to the dynamic, diverse, and unstructured nature of web environments. Traditional web scraping techniques, while effective, require rigid configurations tied to specific website structures, limiting their generalizability. To address these challenges, this work explores the use of autonomous agents powered by Large Language Models for autonomous web navigation, focusing on the retrieval of academic publications from the websites of preprint repositories. The proposed solution, based on hyperlink exploration, is designed as a component of a potentially broader system for AI-driven paper search assistance. It leverages a multi-agent architecture and a structured, tree-traversal-like approach to explore and extract relevant documents. Each agent is assigned a specific role, including relevant URL extraction, document collection, planning, presentation, and quality control. The system is implemented using AutoGen, which enables flexible agent interactions and modular design. Unlike traditional web information extraction techniques, this approach generalizes navigation patterns across different websites without relying on predefined HTML selectors. Experimental results are promising, demonstrating the system's effectiveness in retrieving relevant academic content. However, challenges such as increased response times and occasional hallucinations indicate areas for refinement. Future work aims to enhance interactivity by integrating advanced form-based search capabilities, optimizing retrieval efficiency, and implementing more robust evaluation frameworks. These improvements could contribute to fully automated AI-driven web exploration, facilitating the development of more generalizable autonomous web navigation tools.
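A minimal sketch of the tree-traversal-style hyperlink exploration described, where `looks_relevant` stands in for the LLM-based URL-extraction agent; the PDF heuristic and limits are illustrative:

```python
# Breadth-first hyperlink exploration with a pluggable relevance filter.
from collections import deque
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

def looks_relevant(url, query):
    # Placeholder: a real system would ask the URL-extraction agent (an LLM)
    # whether this link plausibly leads to papers about `query`.
    return any(tok in url.lower() for tok in query.lower().split())

def explore(seed, query, max_depth=2, max_pages=50):
    seen, found = {seed}, []
    queue = deque([(seed, 0)])
    while queue and len(seen) < max_pages:
        url, depth = queue.popleft()
        if url.endswith(".pdf"):        # leaf: collect the document
            found.append(url)
            continue
        if depth == max_depth:
            continue
        html = requests.get(url, timeout=10).text
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            nxt = urljoin(url, a["href"])
            if nxt not in seen and looks_relevant(nxt, query):
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return found
```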
Direction
FERNANDEZ PICHEL, MARCOS (Tutorships)
Morón Reyes, Francisco José (Co-tutorships)
Court
Gamallo Otero, Pablo (Chairman)
CORES COSTA, DANIEL (Secretary)
CHAVES FRAGA, DAVID (Member)
CRASHED: Consistency-Aware Retrieval Augmented Generation System for High Efficiency Document Retrieval
Authorship
M.J.
Master in artificial intelligence
Defense date
02.21.2025 12:30
Summary
In this work, we introduce CRASHED, a Retrieval-Augmented Generation (RAG) system that leverages the best of vector databases and large language models to produce responses that are both correct and contextually appropriate. Our system combines a database of document embeddings with a large language model to produce responses. Using hybrid search with reranking methods, CRASHED selects the most pertinent document chunks and uses them to produce answers informed by external knowledge. We introduce a conversational agent interface built on LangChain that allows users to communicate with the system in fluent, natural language. Our qualitative evaluation confirms that CRASHED effectively retrieves and incorporates relevant passages, producing factually accurate and contextually relevant responses. This project demonstrates the potential of RAG systems for generating high-quality responses and paves the way for further research and development in this area.
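A self-contained sketch of hybrid retrieval as described: lexical and dense scores are fused and the top chunks handed to a reranker; the toy lexical score stands in for BM25 and no real vector database is used:

```python
# Hybrid search with score fusion; a sketch, not the CRASHED implementation.
import numpy as np

def hybrid_search(query_vec, query_terms, doc_vecs, doc_terms, k=10, alpha=0.5):
    # Dense score: cosine similarity between query and chunk embeddings.
    dense = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9)
    # Toy lexical score standing in for BM25: term-overlap ratio.
    lexical = np.array([len(set(query_terms) & set(t)) / (len(t) + 1)
                        for t in doc_terms])
    fused = alpha * dense + (1 - alpha) * lexical
    top = np.argsort(-fused)[:k]
    return top, fused[top]

# A cross-encoder reranker would then re-score the top chunks against the
# query before they are placed in the LLM prompt as external context.
```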
Direction
FERNANDEZ PICHEL, MARCOS (Tutorships)
Fontenla Seco, Yago (Co-tutorships)
Court
Gamallo Otero, Pablo (Chairman)
CORES COSTA, DANIEL (Secretary)
CHAVES FRAGA, DAVID (Member)
Deepfake Detection: A Comparative Study of a Novel Dataset Against State-of-the-Art Benchmarks
Authorship
A.M.O.
Master in artificial intelligence
Defense date
02.20.2025 10:00
Summary
The study presented in this report is motivated by the development of an AI-based system to detect AI-generated videos. Generative models are becoming increasingly powerful and accessible, which may pose a threat to the general population, since such videos are capable of deceiving individuals and spreading misinformation. The aim of this study is to deploy a first approach to combating these threats by detecting whether a given video is AI-generated or real. We test our model on different scenarios and draw conclusions to improve our future work on this topic.
Direction
TABOADA IGLESIAS, MARÍA JESÚS (Tutorships)
Segura Muros, José Ángel (Co-tutorships)
Román Sarmiento, Raquel (Co-tutorships)
Court
GARCIA TAHOCES, PABLO (Chairman)
IGLESIAS RODRIGUEZ, ROBERTO (Secretary)
CELARD PEREZ, PEDRO (Member)
Development of a Conversational Agent Based on Large Language Models for SQL Querying
Authorship
V.I.D.L.M.O.P.
Master in artificial intelligence
Defense date
02.20.2025 10:30
Summary
The increasing reliance on data-driven decision-making in modern organizations has heightened the demand for intuitive and accessible database querying solutions. However, traditional SQL-based interactions remain a barrier for non-technical users, limiting their ability to retrieve meaningful insights from structured data. This thesis presents the development of a Conversational SQL Agent, an intelligent system leveraging Large Language Models (LLMs) to enable natural language interaction with relational databases. The proposed approach utilizes AutoGen, a framework for LLM-based agents, to construct a multi-agent system that autonomously interprets, validates, and executes SQL queries based on user input. Key innovations include a Planner Agent that supervises query execution and a Database Information Agent that enhances schema understanding. The system is benchmarked against LangChain’s SQL Agent using a subset of the Spider benchmark, demonstrating a 57% improvement in execution accuracy. Additionally, it introduces mechanisms for handling ambiguity, correcting user errors, and iteratively refining queries to ensure correctness. The findings highlight the potential of LLM-driven agents in democratizing data access, making structured querying more accessible to users without SQL expertise.
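The execution accuracy metric cited above counts a predicted query as correct when it returns the same result set as the gold query. A sketch over SQLite, comparing results irrespective of row order:

```python
# Execution-accuracy check for one example; a sketch of the metric only.
import sqlite3

def execution_match(db_path, pred_sql, gold_sql):
    conn = sqlite3.connect(db_path)
    try:
        pred = conn.execute(pred_sql).fetchall()
        gold = conn.execute(gold_sql).fetchall()
    except sqlite3.Error:
        return False                 # a query that fails to run is wrong
    finally:
        conn.close()
    # Compare result multisets; key=repr tolerates mixed value types.
    return sorted(pred, key=repr) == sorted(gold, key=repr)
```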
Direction
FERNANDEZ PICHEL, MARCOS (Tutorships)
RAMA MANEIRO, EFREN (Co-tutorships)
Court
GARCIA TAHOCES, PABLO (Chairman)
IGLESIAS RODRIGUEZ, ROBERTO (Secretary)
CELARD PEREZ, PEDRO (Member)
Integrating LLMs and fuzzy rule-based systems for generating trustworthy explanations
Authorship
P.M.P.F.
Master in artificial intelligence
Defense date
02.20.2025 11:00
Summary
In this work, we have developed and validated fuzzy-grounded, trustworthy, natural language explanations through the integration of Large Language Models (LLMs) with fuzzy rule-based systems. We exploit fuzzy logic's proficiency in dealing with the inherent vagueness and uncertainty of human language, and propose an effective way to combine fuzzy rule inference with the powerful verbalization capabilities of LLMs. Our aim is for the explanations to balance naturalness and faithfulness to the underlying data. The proposed software architecture is open source, modular, and easy to transfer between contexts and application domains. We used the medical field as a proof of concept, as it is a domain where trustworthiness is critical. We observed a significant improvement in our hallucination-detection metric for the experimental setups under study, pointing to better faithfulness; our architecture also produced significantly more compact narratives, avoiding the usual 'over-explaining' issue of LLMs, which may damage their naturalness, without a loss in predictive accuracy.
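A minimal, self-contained sketch of the core idea (not the authors' open-source architecture): a fuzzy partition maps a crisp input to linguistic terms with membership degrees, and only the fired terms are exposed in the LLM prompt, so the narrative stays grounded in the inference trace. The membership functions, the blood-pressure variable, and the prompt wording are illustrative assumptions.

# Sketch: ground an LLM explanation in a fuzzy inference trace (illustrative only).

def tri(x, a, b, c):
    """Triangular membership function over [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic partition for systolic blood pressure (mmHg).
TERMS = {
    "low":    lambda x: tri(x, 70, 90, 110),
    "normal": lambda x: tri(x, 100, 120, 140),
    "high":   lambda x: tri(x, 130, 160, 200),
}

def fired_terms(x):
    """Return linguistic terms with nonzero membership, strongest first."""
    degrees = {t: f(x) for t, f in TERMS.items()}
    return sorted(((t, d) for t, d in degrees.items() if d > 0),
                  key=lambda td: -td[1])

reading = 135.0
trace = fired_terms(reading)  # [('normal', 0.25), ('high', 0.17)]

# The prompt exposes only the fired terms, so a faithful LLM answer cannot
# introduce facts outside the inference trace.
prompt = (
    "Explain this result to a patient in two sentences. Use ONLY these facts:\n"
    + "\n".join(f"- blood pressure is '{t}' to degree {d:.2f}" for t, d in trace)
)
print(prompt)  # send `prompt` to any LLM client of your choice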
Direction
ALONSO MORAL, JOSE MARIA (Tutorships)
CATALA BOLOS, ALEJANDRO (Co-tutorships)
Court
GARCIA TAHOCES, PABLO (Chairman)
IGLESIAS RODRIGUEZ, ROBERTO (Secretary)
CELARD PEREZ, PEDRO (Member)
Optimization of Project Portfolio Management
Authorship
E.P.V.
Master in artificial intelligence
Defense date
02.20.2025 10:00
Summary
This Master's thesis (TFM) aims to develop and deploy into production an improved portfolio management tool powered by machine learning (ML). The project was carried out both at the Agency for Electronic Government and the Information and Knowledge Society (AGESIC) and at the United Nations, in response to the growing demand for ML-based decision-support systems. The main problem identified is the lack of tools that facilitate ML-assisted decision-making, which reduces efficiency in portfolio management and project selection. This gap affects both Uruguay and international financial bodies, including credit agencies and the Inter-American Development Bank (IDB). Poor portfolio management can affect the distribution of funding, AGESIC's credit rating, and service delivery in general. The methodology follows a hierarchical, AI-based model and covers end-to-end AI development, AI for data management (DAMA), AI for project management, software engineering, and TOGAF AI for governance at the strategic level. The dataset used contains 183,281 records and 362 variables, organized into three levels to optimize information processing. EDA FULL is the main analysis tool and includes date-time, numeric, integer, categorical, and text data, occupying 506.2 MB of disk space. SUB EDA segments the data into more specific thematic subsets, such as Risks, Lessons Learned, Programs, Deliverables, Budget, Other Data, and Non-Relevant Information, with 71,383 records each. EDA MODELS applies statistical techniques to extract information and develop predictive models, including EDA Regression and Random Forest Model (37,050 records, 16 variables), EDA Time Series Models (29,704 records, 8 variables), and EDA Recommendation System (9,564 records, 4 variables). During exploratory data analysis (EDA), models such as Logistic Regression and a Random Forest ensemble model were developed for classification. For time series prediction, advanced algorithms such as Simple RNN, LSTM, GRU, and Bidirectional RNN were tested and used. For the recommendation system, different algorithms were evaluated, such as NormalPredictor, BaselineOnly, and KNNWithZScore, with similarity metrics such as cosine, MSD, and Pearson, in addition to SVD. The results show 86.48% accuracy in classification tests with Logistic Regression, an MAE of 0.0001 in budget prediction with Prophet, and an SVD-based recommendation system with a test MAE of 0.7748. All three models passed the acceptance tests, confirming that they are effective, efficient, and reliable for improving portfolio management. Finally, classified as a strong process innovation according to the Oslo Manual, this methodology maximizes efficiency and introduces a new perspective on AI-based decision-making.
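For the recommendation component, the summary reports an SVD model evaluated by MAE. The snippet below is a generic sketch of that kind of evaluation using the scikit-surprise library on synthetic data; the column names, rating scale, and values are assumptions, since the AGESIC portfolio dataset is not public.

# Sketch: SVD recommender evaluated with MAE (scikit-surprise), synthetic data.
import pandas as pd
from surprise import SVD, Dataset, Reader, accuracy
from surprise.model_selection import train_test_split

# Hypothetical (user, item, rating) triples standing in for portfolio data.
df = pd.DataFrame({
    "manager": ["u1", "u1", "u2", "u2", "u3", "u3", "u4", "u4"],
    "project": ["p1", "p2", "p1", "p3", "p2", "p3", "p1", "p2"],
    "score":   [5, 3, 4, 2, 5, 4, 1, 3],
})

reader = Reader(rating_scale=(1, 5))
data = Dataset.load_from_df(df[["manager", "project", "score"]], reader)
trainset, testset = train_test_split(data, test_size=0.25, random_state=42)

algo = SVD(random_state=42)
algo.fit(trainset)
predictions = algo.test(testset)
accuracy.mae(predictions)  # the thesis reports a test MAE of 0.7748 on its own data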
Direction
Cotos Yáñez, José Manuel (Tutorships)
MERA PEREZ, DAVID (Co-tutorships)
Mato Fernández, Daniel (Co-tutorships)
Court
TABOADA IGLESIAS, MARÍA JESÚS (Chairman)
Pardo López, Xosé Manuel (Secretary)
VILA BLANCO, NICOLAS (Member)
Deployment of Deep Learning Networks at the Edge Using Compression Techniques
Authorship
M.R.R.
Master in artificial intelligence
Defense date
02.20.2025 10:30
Summary
In recent years, deep learning models have grown exponentially in complexity and size, making their deployment on resource-constrained edge devices a significant challenge. This MSc thesis (TFM) explores model compression techniques, including quantization, pruning, and knowledge distillation, to optimize deep neural networks for edge computing while maintaining high accuracy. The proposed methods are applied to a ResNet-50 model trained on the CIFAR-10 dataset, evaluating their impact on model size, inference speed, and accuracy. The optimized models are deployed on the ZCU104 FPGA using Vitis AI, demonstrating the feasibility of running efficient deep learning models on hardware with limited computational resources. Experimental results show that a combination of compression techniques can achieve up to 95% model size reduction while preserving or even improving accuracy. These findings highlight the potential of model compression for enabling real-time inference on edge devices, reducing latency, and improving energy efficiency.
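As a generic illustration of two of the techniques named above (not the thesis's Vitis AI flow, which targets the ZCU104 FPGA), the sketch below applies L1 magnitude pruning to a ResNet-50's convolutions and dynamic quantization to its linear layers, then reports the resulting weight sparsity. The pruning amount and layer choices are arbitrary assumptions.

# Sketch: magnitude pruning + dynamic quantization on ResNet-50 (PyTorch).
# Illustrative only; the thesis deploys via Vitis AI on a ZCU104 FPGA instead.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision.models import resnet50

model = resnet50(weights=None)  # untrained weights suffice to show the mechanics

# 1) Unstructured L1 pruning: zero out 50% of each conv layer's weights.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the zeroed weights permanent

# 2) Dynamic quantization: int8 weights for the nn.Linear classifier head.
model = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

# Report global sparsity over the remaining floating-point parameters.
total = zeros = 0
for p in model.parameters():
    total += p.numel()
    zeros += (p == 0).sum().item()
print(f"sparsity: {zeros / total:.1%}")

In a real pipeline the pruned model would be fine-tuned to recover accuracy before quantization; that step is omitted here for brevity.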
Direction
BREA SANCHEZ, VICTOR MANUEL (Tutorships)
PARDO SECO, FERNANDO RAFAEL (Co-tutorships)
Court
TABOADA IGLESIAS, MARÍA JESÚS (Chairman)
Pardo López, Xosé Manuel (Secretary)
VILA BLANCO, NICOLAS (Member)