Removal of Emerging Contaminants (Microcontaminants, Antibiotic Resistances, and Pathogens) Through Anaerobic Biological Processes: State-of-the-Art Review and Experimental Study Proposal
Authorship
A.F.S.
Master in Environmental Engineering (3rd ed)
Defense date
02.27.2024 12:00
Summary
Contaminants of Emerging Concern (CECs) have gained significant relevance in the field of environmental engineering, notably because they reach aquatic environments through effluents from wastewater treatment plants. Among the CECs present in urban wastewater, antibiotics, pathogenic microorganisms, and the resistance these microorganisms develop and transmit via antibiotic resistance genes (ARGs) are particularly notable. Domestic wastewater constitutes a significant source of CECs; however, the individual wastewater management systems commonly employed in areas of dispersed population are insufficient to address both conventional and emerging contaminants adequately. The development of economically efficient technologies is therefore imperative. In this study, a comprehensive review of the state of the art in technologies available for CEC removal from wastewater was conducted, together with an evaluation of the efficacy of treatment methods for removing these contaminants. In addition, an experimental study was designed to assess the effectiveness of an anaerobic process for CEC removal from segregated wastewater. The state-of-the-art review was based on a keyword search strategy focused on relevant terms such as organic microcontaminants, antibiotics, antibiotic resistance, pathogens, wastewater treatment plants, and decentralized treatment. The experimental study, in turn, involved the start-up and optimization of an Upflow Anaerobic Sludge Blanket (UASB) reactor for the treatment of wastewater from septic tanks. As a result, a selection of microcontaminants in general, and of antibiotics in particular (sulfamethoxazole, trimethoprim, ciprofloxacin, and erythromycin), as well as ARGs (intI1, sul1, blaCTX-M, mcr, ermB, and qnrS) and pathogens (Klebsiella, Enterococcus, norovirus, sapovirus, and hepatitis virus), was made for analysis. Regarding the experimental setup, a wastewater conservation system was developed, and 79% removal of Chemical Oxygen Demand (COD) was achieved in the UASB reactor.
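A minimal illustration (not part of the thesis; the concentrations below are hypothetical) of how a reactor removal efficiency such as the reported 79% COD removal is usually computed from influent and effluent measurements:

```python
# Minimal sketch (hypothetical values, not thesis data): COD removal efficiency
# across a treatment step, defined as the relative decrease between influent
# and effluent concentrations.

def cod_removal_efficiency(cod_in_mg_l: float, cod_out_mg_l: float) -> float:
    """Return the percentage of Chemical Oxygen Demand removed."""
    return 100.0 * (cod_in_mg_l - cod_out_mg_l) / cod_in_mg_l

if __name__ == "__main__":
    # Hypothetical septic-tank influent and UASB effluent concentrations (mg/L)
    print(f"{cod_removal_efficiency(500.0, 105.0):.1f}% COD removal")
```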
Direction
Omil Prieto, Francisco (Tutorships)
GARRIDO FERNANDEZ, JUAN MANUEL (Co-tutorships)
Court
HOSPIDO QUINTANA, ALMUDENA (Chairman)
FERNANDEZ ESCRIBANO, JOSE ANGEL (Secretary)
ALDREY VAZQUEZ, JOSE ANTONIO (Member)
Sequential valorization of apple bagasse: extraction of bioactive compounds and production of biofuels by ABE fermentation.
Authorship
A.B.M.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 10:00
Summary
In this master's thesis, the feasibility of sequentially valorizing apple pomace, a waste from the cider industry, was studied. The valorization comprises a drying stage, a bioactive compound extraction stage, an enzymatic hydrolysis stage, and a biofuel production stage by ABE fermentation. First, the apple pomace was dried in ovens at 50 ºC for periods of two, four, and seven days, reaching moisture contents of 21.51%, 18.18%, and 7.68%, respectively. Each pomace fraction presented different characteristics, which were evaluated in the next stage. Subsequently, the extraction was carried out using a water-ethanol mixture as solvent, obtaining extracts with 32.02 mg GAE/g DW of total phenolic compounds, 6.67 mg TE/g DW of antioxidant activity by the DPPH method, 15.55 mg TE/g DW by the FRAP method, 39.10 mg TE/g DW by the ABTS method, and 6.34 mg RE/g DW of total flavonoids. It was found that the extraction stage entails an associated loss of part of the constituent sugars of the apple pomace, which limits the subsequent valorization stages. In the next stage, enzymatic hydrolysis, the parameters that most affect the operation were studied, concluding that strict pH control and a suitable geometry and heating system in the reaction vessel are necessary for the sugars to be released correctly. It was also confirmed that when apple pomace from the extraction stage is used, the enzymatic hydrolysis reaches a total reducing sugar concentration of only 63.51 g/L. Finally, the ABE fermentation stage was evaluated in a glucose-based control medium (P2) and in apple hydrolysate media (HBM), prepared from both buffered and aqueous hydrolysates, using C. beijerinckii as inoculum. It was found that the greatest production of metabolites occurred in the synthetic medium composed only of glucose, while the buffered HBM medium inhibited bacterial growth and the aqueous HBM medium allowed the biological reaction to proceed, albeit with lower yields and productivities than the P2 medium.
Direction
EIBES GONZALEZ, GEMMA MARIA (Tutorships)
LU CHAU, THELMO ALEJANDRO (Co-tutorships)
FARIÑAS MERA, RAQUEL (Co-tutorships)
Court
ROCA BORDELLO, ENRIQUE (Chairman)
VAL DEL RIO, MARIA ANGELES (Secretary)
RODRIGUEZ OSORIO, CARLOS (Member)
Characterization of Emerging Contaminants of Concern during Wastewater Treatment from Individual Treatment Systems
Authorship
A.S.S.
Master in Chemical and Bioprocess Engineering
Defense date
07.09.2024 13:15
Summary
In recent years, there has been growing interest in studying a new group of contaminants present in wastewater, known as contaminants of emerging concern (CECs). Among the CECs are organic micropollutants (OMPs), which are detected at concentrations ranging from µg/L to ng/L. Environmental analyses indicate that these pollutants are present in most surface waters. Among the many micropollutants, antibiotics stand out because they can contribute to the emergence of antibiotic-resistant microorganisms (ARMs). Antibiotic resistance is currently linked to some 700,000 deaths worldwide, a figure that could rise to 10 million by 2050. An effective way to combat this problem is the treatment of water in wastewater treatment plants (WWTPs); however, the population in rural areas is generally not connected to WWTPs. In these areas, water is treated with individual treatment systems such as septic tanks or storage tanks, which generally provide only a low level of purification. Information on this type of water, regarding both conventional contaminants and CECs, is scarce or non-existent. This work characterizes the CECs present in the waters of individual treatment systems and studies their removal in an activated sludge reactor.
Direction
GARRIDO FERNANDEZ, JUAN MANUEL (Tutorships)
Court
CASARES LONG, JUAN JOSE (Chairman)
EIBES GONZALEZ, GEMMA MARIA (Secretary)
FRANCO URIA, MARIA AMAYA (Member)
Deterpenation of essential citrus oils by extraction with a biocompatible ionic liquid
Authorship
A.R.G.
Master in Chemical and Bioprocess Engineering
Defense date
09.11.2024 12:30
Summary
Citrus essential oils are widely used in the food industry as preservatives, taste enhancers and flavourings, among other uses. These oils are mostly made up of terpenes, which do not particularly contribute to the desired organoleptic properties and can also cause problems associated with the stability of the essential oil. One way to reduce the volumes to be handled and to improve the properties of these oils is to subject them to a deterpenation process, for which liquid-liquid extraction is one of the most commonly used techniques. However, the conventional solvents used in deterpenation show low thermodynamic performance and some incompatibility with essential oil applications. In this context, research into replacing these solvents in the deterpenation of citrus essential oils with more favourable alternatives is increasing. In this regard, different ionic liquids have been studied, since they generally present good properties to be considered as alternative solvents in industrial applications. However, although some of the ionic liquids tested to date show very interesting behaviour from a thermodynamic perspective as solvents for deterpenation, their toxicity or biodegradability profile reduces the feasibility of their use in the corresponding process on an industrial scale. Thus, this Master's Thesis proposes, as an alternative, the use of an ionic liquid with a more biocompatible profile: specifically, (2-hydroxyethyl)trimethylammonium 2-hydroxypropanoate (also known as choline lactate). The ionic liquid was synthesised and the liquid-liquid equilibrium was determined for the ternary model limonene + linalool + ionic liquid (modelling the essential oil as a binary mixture of limonene and linalool) at 25 °C and atmospheric pressure. Based on the equilibrium data obtained, the thermodynamic feasibility of this ionic liquid for the deterpenation of citrus essential oils by liquid-liquid extraction was analysed. Although the solute distribution ratio and selectivity results are slightly lower than those of other ionic liquids previously explored in the literature, the more sustainable credentials of choline lactate may make it a more suitable solvent for deterpenation at industrial scale. The liquid-liquid equilibrium data were acceptably correlated by means of classical thermodynamic models, allowing their use in the Aspen Plus chemical process simulator for optimisation of the extraction process. Optimum values for a deterpenation column operating in counter-current corresponded to a solvent/feed ratio of about 2.8 and a total of 8 theoretical stages.
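As an illustration of the solvent-screening figures of merit mentioned above, the following sketch (hypothetical tie-line compositions, not data from this thesis) computes the solute distribution ratio and the selectivity typically used to judge a solvent for deterpenation by liquid-liquid extraction:

```python
# Minimal sketch (assumed mole-fraction values, not thesis data):
#   distribution ratio of the solute (linalool): beta = x_extract / x_raffinate
#   selectivity of the solvent: S = beta_linalool / beta_limonene

def distribution_ratio(x_extract: float, x_raffinate: float) -> float:
    """Ratio of solute mole fraction in the extract to that in the raffinate."""
    return x_extract / x_raffinate

def selectivity(beta_solute: float, beta_diluent: float) -> float:
    """How preferentially the solvent extracts the solute over the diluent."""
    return beta_solute / beta_diluent

if __name__ == "__main__":
    # Hypothetical tie-line compositions; the same basis must be used for both phases
    beta_linalool = distribution_ratio(x_extract=0.040, x_raffinate=0.060)
    beta_limonene = distribution_ratio(x_extract=0.020, x_raffinate=0.900)
    print(f"beta_linalool = {beta_linalool:.2f}, "
          f"selectivity = {selectivity(beta_linalool, beta_limonene):.1f}")
```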
Direction
RODRIGUEZ MARTINEZ, HECTOR (Tutorships)
PENA PUGA, CARLOS ALBERTO (Co-tutorships)
Court
CASARES LONG, JUAN JOSE (Chairman)
EIBES GONZALEZ, GEMMA MARIA (Secretary)
FRANCO URIA, MARIA AMAYA (Member)
Optimization of tannins as coadjuvants for finished beer filtration.
Authorship
R.G.F.S.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 11:00
Summary
The clarification of beer by filtration is a crucial phase for eliminating particles and sediments, ensuring a product with good colloidal stability, long shelf life and brightness that is organoleptically correct and meets consumer expectations. Traditionally at Estrella Galicia, enzymes and silica gel have been used as maturation and filtration aids. Turbidity in beer refers to the presence of suspended particles that affect its transparency; a significant portion of these particles is due to the protein-polyphenol complex and its interactions. The goal is to remove part of this turbidity-forming complex while preserving the proteins and polyphenols that contribute to the beer's sensory profile. In the constant pursuit of optimizing the efficiency and sustainability of the brewing process, there is a need to research and test commercially available alternatives that positively impact the environment and improve costs, quality, and operations. Tannins, polyphenolic compounds found in various natural sources, have demonstrated clarifying and stabilizing properties in other food industries. The modification of the filtration aid cannot be analyzed in isolation, but rather as part of a complex process affected by many variables. Filtration efficiency, beer stability, organoleptic characteristics, and cost will be the key factors for understanding whether the modification is an improvement and can be definitively implemented.
Direction
RODRIGUEZ MARTINEZ, HECTOR (Tutorships)
Vázquez Pedrouzo, Diego (Co-tutorships)
Court
ROCA BORDELLO, ENRIQUE (Chairman)
VAL DEL RIO, MARIA ANGELES (Secretary)
RODRIGUEZ OSORIO, CARLOS (Member)
The effect of land use change on soil carbon content and forms
Authorship
A.M.M.S.
Master in Environmental Engineering (3rd ed)
Defense date
06.25.2024 10:00
Summary
Soil is known to be the most important carbon store on the Earth's surface. Changes in land use and/or management may affect soil organic carbon content, which can lead to an increase in greenhouse gas concentrations in the atmosphere. This study, carried out within the ReCrop project (Bioinocula and CROPping systems: an integrated biotechnological approach for improving crop yield, biodiversity and REsilience of Mediterranean agro-ecosystems), aims to determine the effects of land use change on organic carbon fractions. Soil samples from the 0-10 cm and 10-20 cm layers were collected from a plot located in the municipality of Trabada, both in the year in which an over-80-year-old meadow was converted to maize (Zea mays L.) cropland and 28 months later. Several extractions were used to achieve the aim of the present study (hot-water extractable carbon, potassium permanganate oxidizable carbon, sodium pyrophosphate soluble carbon and potassium sulphate soluble carbon), as well as a physical fractionation of the soil organic carbon. The results demonstrate the effect of the change in land management from meadow to maize cropland and how the different organic carbon fractions respond to this variation. The most relevant consequences were produced by ploughing, which promotes carbon mineralization as well as homogenization between the studied soil layers.
Direction
MONTERROSO MARTINEZ, MARIA DEL CARMEN (Tutorships)
Prieto Fernández, María Ángeles (Co-tutorships)
Trasar Cepeda, Mª del Carmen (Co-tutorships)
Court
BARRAL SILVA, MARIA TERESA DEL CARMEN (Chairman)
VAL DEL RIO, MARIA ANGELES (Secretary)
PEÑA VAZQUEZ, ELENA MARIA (Member)
Rheological and stability assessment of olive oil/water emulsions with chitosan for oleogel production
Authorship
D.R.N.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 13:00
Summary
The substitution of animal-origin fats with healthier plant-origin fats is an objective the food industry has been working on for the past few years. To achieve this, it is necessary to transform vegetable oils, which are typically liquid, into fats that possess the texture and organoleptic properties of the animal fats they aim to replace. Recently, research has focused on the development of new products, called oleogels, suitable for this purpose. An oleogel is essentially a vegetable oil trapped in a three-dimensional network, often built with a polymer such as chitosan. There are two main methods for producing these oleogels: direct and indirect. Among the variations of the indirect method, one of the most employed and studied uses an emulsion as an intermediate step (emulsion template), which is then dried or lyophilized to obtain the final product. This Master's Thesis primarily aims to study the rheological properties and analyze the stability of olive oil/water emulsions with chitosan and vanillin for oleogel production. Various operational variables were studied to determine the most influential and critical ones for stability: agitation speed (6,500, 9,500, 13,500, and 17,500 min-1), homogenization time (0.5, 2, 4, and 10 min), reaction temperature between vanillin and chitosan (25, 40, and 55 °C), chitosan concentration (0.6, 0.7, 0.8, 0.9, and 1.0), the ratio of vanillin molecules to chitosan monomers in the final emulsion (0.0, 0.3, 0.7, and 1.3), and the addition of emulsifying agents (Tween 20 and 60). These emulsions underwent rheological characterization through rotational tests (time and shear rate sweeps) and oscillatory tests (deformation and frequency sweeps), as well as temperature ramps allowing accelerated stability analysis. The viscous (G'') and elastic (G') moduli determined through frequency sweeps fit the power-law model satisfactorily, and the complex viscosity could be simulated with the Cross-Williamson model. Additionally, the viscous component was always greater than the elastic one in the studied systems, following the trend reported in the literature for emulsions with a similar volumetric fraction of the dispersed phase (0.52). The shear rate sweeps showed that the apparent viscosity of the emulsions is not time-dependent. Flow curves conformed to the Herschel-Bulkley model, characterizing the emulsions as pseudoplastic. Comparing rotational and oscillatory tests, it was observed that, for all studied emulsions except those with emulsifying agents, the Cox-Merz rule held for high shear and angular velocities (1 s-1 or rad/s). The comparison between homogenization stage variables showed that time and agitation speed did not significantly affect emulsion stability, whereas reaction temperature, chitosan concentration, vanillin/chitosan ratio, and the addition of emulsifying agents did. Finally, the proposed operating protocol consists of homogenization for 4 minutes at 9,500 min-1, a 0.9 w/w chitosan concentration in the final emulsion, a vanillin to chitosan ratio of 0.7, a reaction temperature of 55 °C, and the use of Tween 20 as emulsifying agent.
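As a sketch of the flow-curve analysis described above (synthetic data, not the thesis measurements; the fitting routine is an assumption, not the authors' code), the Herschel-Bulkley model tau = tau0 + K·gamma_dot^n can be fitted to shear stress versus shear rate data, with n < 1 indicating pseudoplastic behaviour:

```python
# Minimal sketch: fit the Herschel-Bulkley model to a synthetic flow curve.
import numpy as np
from scipy.optimize import curve_fit

def herschel_bulkley(gamma_dot, tau0, K, n):
    """Shear stress (Pa) as a function of shear rate (1/s)."""
    return tau0 + K * gamma_dot**n

# Hypothetical flow-curve data (shear rate in 1/s, shear stress in Pa)
gamma_dot = np.logspace(-1, 2, 20)
tau_measured = 2.0 + 5.0 * gamma_dot**0.45 + np.random.normal(0.0, 0.1, gamma_dot.size)

popt, _ = curve_fit(herschel_bulkley, gamma_dot, tau_measured, p0=[1.0, 1.0, 0.5])
tau0, K, n = popt
print(f"tau0 = {tau0:.2f} Pa, K = {K:.2f} Pa·s^n, n = {n:.2f} (n < 1 -> pseudoplastic)")
```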
Direction
MOREIRA MARTINEZ, RAMON FELIPE (Tutorships)
FRANCO RUIZ, DANIEL JOSE (Co-tutorships)
Court
VIDAL TATO, MARIA ISABEL (Chairman)
ROMERO CASTRO, NOELIA MARIA (Secretary)
RODRIGUEZ MARTINEZ, HECTOR (Member)
Valorization of lignocellulosic materials as binders for the generation of sustainable bituminous emulsions.
Authorship
J.L.A.S.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 09:25
Summary
This document addresses the development of sustainable alternatives based on technical Kraft lignin from the pulp and paper industry for the total or partial substitution of bitumen emulsion binders. To this end, three different lignin modification routes have been proposed, and experimental designs have been developed to optimize the operating conditions of these routes. The modified lignins were then characterized by determining their hydroxyl index and by FTIR, 1H-NMR and 13C-NMR spectroscopy. Finally, a brief economic analysis of the studied additives was carried out.
Direction
GONZALEZ ALVAREZ, JULIA (Tutorships)
FREIRE LEIRA, MARIA SONIA (Co-tutorships)
Castro López, María Mar (Co-tutorships)
Court
ROCA BORDELLO, ENRIQUE (Chairman)
VAL DEL RIO, MARIA ANGELES (Secretary)
RODRIGUEZ OSORIO, CARLOS (Member)
Design and implementation of a modern data architecture based on the Microsoft Fabric platform
Authorship
S.B.M.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 18:00
Summary
The ability to analyze ever-growing volumes of data has become essential for companies seeking to make quick decisions based on accurate data. Traditional data analysis technologies currently struggle to process the enormous volumes of data created, making more scalable and effective solutions necessary. This Master's Thesis focuses on the design and implementation of a modern data architecture based on Microsoft Fabric. The project includes a detailed investigation of the features and functionalities of Microsoft Fabric, as well as its structure and main components. It is organized in several stages, starting with the theoretical understanding of Microsoft Fabric and its integration with SDG Group's data strategies. A complete data integration flow is designed and implemented to evaluate Microsoft Fabric's capabilities in terms of scalability, performance, ease of use, and cost. Additionally, a comparison is made with other massive data analysis solutions such as Google BigQuery, Amazon Redshift, and Snowflake, highlighting Microsoft Fabric's competitive advantages. Finally, conclusions about Microsoft Fabric's potential are presented. The results indicate that Microsoft Fabric is a valuable tool that can provide a significant competitive advantage through efficient and effective analysis of large volumes of data.
Direction
Triñanes Fernández, Joaquín Ángel (Tutorships)
López Chao, Brais (Co-tutorships)
Court
Argüello Pedreira, Francisco Santiago (Chairman)
LAMA PENIN, MANUEL (Secretary)
FELIX LAMAS, PAULO (Member)
Analysis of the impact of Knowledge Graphs on the explainability of an LLM model
Authorship
O.F.F.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 18:00
Summary
In this report, we present knowledge graphs as a tool to improve the explainability of Large Language Models (LLMs) through the integration of knowledge graphs and LLMs. To test its effectiveness, we apply it to a use case in medicine: the InnovaTrial project, by the Servizo Galego de Saúde in collaboration with the company NTTData.
Direction
FELIX LAMAS, PAULO (Tutorships)
Kreibel , Denis (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
López Martínez, Paula (Secretary)
RIOS VIQUEIRA, JOSE RAMON (Member)
Large-Scale financial data analysis for spending pattern extraction and behavioral predictions
Authorship
M.P.R.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.16.2024 10:00
Summary
The central purpose of this Master's Thesis is to conduct a comprehensive analysis of the behavior of banking institution users through a detailed review of a large number of banking transactions. The primary objective is to understand consumption patterns, anticipate financial trends, and consequently, enhance the personalization of the banking services offered. More specifically, a subset of transactional data of interest has been selected for practical purposes, and various strategies for analyzing, processing, and aggregating user transactions have been explored for the following goals. The first goal is to discover relevant information that allows for the homogeneous grouping of users based on their spending patterns and to identify similarities and relevant characteristics of each group. The second goal is to use supervised learning algorithms to predict whether a user will be inclined to contract a specific service or not.
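A minimal sketch of the two goals described above, using random placeholder data and hypothetical feature names (the thesis's actual dataset, features and algorithms are not specified here): unsupervised grouping of users by spending pattern followed by a supervised propensity model for contracting a service:

```python
# Minimal sketch: clustering of spending profiles + propensity classification.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Each row: aggregated spending per category for one user (hypothetical schema)
X = rng.gamma(shape=2.0, scale=100.0, size=(1000, 8))
# Placeholder label standing in for "contracted the service or not"
y = (X[:, 0] + X[:, 3] > np.median(X[:, 0] + X[:, 3])).astype(int)

# Goal 1: homogeneous user groups based on spending patterns
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Goal 2: predict the propensity to contract a specific service
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```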
Direction
Sánchez Vila, Eduardo Manuel (Tutorships)
Barba Seara, Óscar (Co-tutorships)
Fortes González, Daniel (Co-tutorships)
Court
Losada Carril, David Enrique (Chairman)
Blanco Heras, Dora (Secretary)
AMEIJEIRAS ALONSO, JOSE (Member)
System for Automated Generation of Dimensions in a Dimensional Model
Authorship
R.F.I.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 17:30
Summary
In this master's thesis, a tool is developed to automate the creation of dimensions in a dimensional model. The dimensional model is a database design technique oriented towards querying and analyzing large amounts of data. In this context, a dimension is a table in the database that acts as a category or attribute to describe the data to be analyzed and on which grouping and filtering actions are performed. The project is composed of a web application that facilitates the end-user creation of these types of dimensions and another part that is responsible for maintaining and updating the dimension data. The system for creating dimensions consists of a web application that users can connect to in order to transform information from a source database to a dimensional format in a target database. The system facilitates the connection with both databases and also provides a simple interface to select the data of interest from the source database and configure the different options available when defining a dimension. The requirements for information updating and historization condition the design of the dimensions. Specifically, different types of dimensions are established according to the update techniques to be applied. In this project, a task has been created to insert new data or modify existing ones in the target database depending on changes in the source database. This task can be scheduled to run periodically according to the needs of the users.
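As a rough illustration of the kind of scheduled update task described above (hypothetical column names and update strategy; the system's real design and databases are not described in the abstract), a simple dimension refresh can upsert new or changed source rows into the target dimension:

```python
# Minimal sketch: overwrite changed attributes of existing dimension members and
# append brand-new members, the sort of job run periodically against the source.
import pandas as pd

def refresh_dimension(dim: pd.DataFrame, source: pd.DataFrame, key: str) -> pd.DataFrame:
    """Upsert `source` rows into dimension `dim`, matching on the natural key."""
    merged = dim.set_index(key)
    incoming = source.set_index(key)
    merged.update(incoming)                                   # update existing members
    new_members = incoming.loc[~incoming.index.isin(merged.index)]
    return pd.concat([merged, new_members]).reset_index()     # add new members

# Hypothetical dimension and source extracts
dim = pd.DataFrame({"customer_id": [1, 2], "city": ["Lugo", "Vigo"]})
src = pd.DataFrame({"customer_id": [2, 3], "city": ["Ourense", "Ferrol"]})
print(refresh_dimension(dim, src, key="customer_id"))
```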
Direction
Fernández Pena, Anselmo Tomás (Tutorships)
Graña Omil, Ángel (Co-tutorships)
Court
Argüello Pedreira, Francisco Santiago (Chairman)
LAMA PENIN, MANUEL (Secretary)
FELIX LAMAS, PAULO (Member)
Implementation of industrial logistics in waste management facilities
Authorship
A.H.P.
Master in Environmental Engineering (3rd ed)
Defense date
06.25.2024 10:30
Summary
This work is part of a collaboration with a prominent Galician company in the waste management sector. The main objective is to analyze waste treatment methods from the perspective of industrial production. To do this, an analytical approach specific to industrial production systems, similar to that applied in manufacturing processes, will be adopted. The study will focus on evaluating the operability of the company's current waste treatment methods. Through these analyses, the aim is to identify adjustments that may be needed before the company moves its activities to a larger facility. This transfer is proposed in order to increase the capacity for receiving and processing waste, responding to the growing demands of the sector.
Direction
BELLO BUGALLO, PASTORA MARIA (Tutorships)
Court
BARRAL SILVA, MARIA TERESA DEL CARMEN (Chairman)
VAL DEL RIO, MARIA ANGELES (Secretary)
PEÑA VAZQUEZ, ELENA MARIA (Member)
Parallel 3D convolutions
Authorship
A.F.L.
Máster Universitario en Computación de Altas Prestaciones / High Performance Computing from the University of A Coruña and the University of Santiago de Compostela
Defense date
07.03.2024 10:30
Summary
In this Master's Thesis, a 3D convolution algorithm is built and optimized to process aerial LiDAR point clouds in order to identify objects similar to a given input pattern. Specifically, a dataset corresponding to residential areas is used and, to evaluate the performance of the algorithm, the identification of buildings is set as the target. Following the construction of the convolution algorithm and experimentation with different point cloud representation structures, shared-memory parallelization optimizations are applied while trying to keep memory consumption low. After applying the considered techniques, the execution time of the optimized part is reduced by 98%, using 64 cores across 2 processors with OpenMP.
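A conceptual sketch of the approach follows (in Python/NumPy for brevity; the thesis implementation is an optimized, OpenMP-parallelized version, and the points and pattern below are random placeholders): voxelize the point cloud into a 3D grid and correlate it with a 3D pattern so that strong responses mark regions resembling the sought object:

```python
# Minimal sketch: voxelization of an (N, 3) point cloud followed by 3D correlation
# of the occupancy grid with a small pattern (template matching in voxel space).
import numpy as np
from scipy.ndimage import correlate

def voxelize(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Return a 3D occupancy grid (point counts per voxel)."""
    idx = np.floor((points - points.min(axis=0)) / voxel_size).astype(int)
    grid = np.zeros(idx.max(axis=0) + 1, dtype=np.float32)
    np.add.at(grid, tuple(idx.T), 1.0)
    return grid

rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 50.0, size=(20000, 3))   # hypothetical LiDAR points (metres)
grid = voxelize(cloud, voxel_size=1.0)

pattern = np.ones((5, 5, 5), dtype=np.float32)    # toy "block" pattern to search for
response = correlate(grid, pattern, mode="constant", cval=0.0)
print("strongest match at voxel", np.unravel_index(np.argmax(response), response.shape))
```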
Direction
CABALEIRO DOMINGUEZ, JOSE CARLOS (Tutorships)
Esmorís Pena, Alberto Manuel (Co-tutorships)
Court
GARCIA LOUREIRO, ANTONIO JESUS (Chairman)
QUESADA BARRIUSO, PABLO (Secretary)
Andión Fernández, José Manuel (Member)
Environmental and economic assessment of a sequential multi-valorization strategy for apple pomace
Authorship
I.C.G.
Master in Chemical and Bioprocess Engineering
Defense date
07.09.2024 11:00
Summary
This work focuses on the development of an innovative cascade multivalorization strategy using apple pomace as raw material. This approach aims to maximize the utilization of available resources and reduce agricultural waste while generating value-added products. First, the cascade multivalorization process is detailed, consisting of three stages: extraction of polyphenols, enzymatic hydrolysis, and ABE fermentation for biobutanol production. An industrial-scale simulation is conducted based on laboratory study results, highlighting that the process is designed to operate in semi-continuous mode, ensuring efficiency and a consistent output of the desired products. Subsequently, an economic and environmental assessment is carried out to compare the process with other alternatives available on the market. The economic analysis shows that the process is viable, but the polyphenol extraction section has the greatest influence, so a decrease in the polyphenol selling price could affect profitability; a sensitivity analysis is conducted to establish a minimum selling price range. The life cycle assessment provides a comprehensive view of the environmental impacts associated with the process. It identifies the extraction of polyphenols and the production of biobutanol as the critical stages in terms of environmental impact, with ethanol and steam use as the main contributors, and it highlights the need for more sustainable practices in the management of these resources to mitigate their environmental impact. To address these challenges, a sensitivity analysis considering alternatives for the highest-impact processes is conducted, allowing an evaluation of how environmental impacts vary when the selection of key processes is modified.
Direction
HOSPIDO QUINTANA, ALMUDENA (Tutorships)
EIBES GONZALEZ, GEMMA MARIA (Co-tutorships)
Court
FEIJOO COSTA, GUMERSINDO (Chairman)
DIAZ JULLIEN, CRISTINA (Secretary)
FERNANDEZ CARRASCO, EUGENIO (Member)
Impact of forest fires on phosphorus bioavailability in savanna environments subjected to recurring fires in Kruger National Park (South Africa)
Authorship
A.V.L.
Master in Environmental Engineering (3rd ed)
Defense date
09.09.2024 12:30
Summary
Phosphorus is one of the essential elements for plant development; however, because of its chemical reactivity, most of the phosphorus present in the soil is not in forms bioavailable to plants. On the other hand, some of the bioavailable forms of phosphorus are very soluble, so they can be leached and cause eutrophication of nearby water bodies. In this work, the phosphorus content and its main geochemical forms are studied in pre- and post-fire African savanna soils, as well as in ashes, in the Kruger National Park, South Africa, in systems subjected to prescribed burning for more than two decades. This study aims to: 1) determine the content of total P, bioavailable P and geochemical forms of P in soils developed on granite and on basalt; and 2) evaluate the effect of fires on the bioavailability and geochemical forms of P in a savanna ecosystem. To this end, soils and ashes were analyzed for total P content and for its most environmentally relevant geochemical forms: adsorbed P, P associated with Fe oxides and hydroxides, P associated with aluminium and clays, organic P, and residual or recalcitrant P. The results show great heterogeneity between plots in terms of total P content, ranging between 1200 mg kg-1 in the surface layer of soils developed on basalt and 70 mg kg-1 in soils developed on granite. The highest P availability (Mehlich P) occurred in basalt soils, with the burned plots showing a higher content of bioavailable P. Regarding the geochemical forms of phosphorus, the soils developed on basalt presented apatite P as the dominant fraction, followed by P associated with recalcitrant organic matter and P associated with clays and Al hydroxides, while in the soils developed on granite the dominant fraction was P associated with recalcitrant organic matter, followed by P associated with humic acids. In the burned soils, the fractionation was similar, with a slight reduction in organic phosphorus values in the post-fire plots.
Direction
OTERO PEREZ, XOSE LOIS (Tutorships)
Santín Nuño, Cristina (Co-tutorships)
Court
MOSQUERA CORRAL, ANUSKA (Chairman)
MONTERROSO MARTINEZ, MARIA DEL CARMEN (Secretary)
BARCIELA ALONSO, Ma CARMEN (Member)
Pilot scale one-unit system for biopolymer production
Authorship
M.A.L.M.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 12:00
Summary
Worldwide waste generation has been increasing in direct proportion to the growth of the human population and the contemporary consumerism model. Among these wastes are the oils and fats derived from the food industry, whose output grows along with the population whose basic needs it must meet. Fatty waste, by its nature, poses significant management challenges, as it is prone to cause environmental, social, and structural issues, such as water pollution, odour production, and the obstruction of pipelines. Among the industries in this sector that generate this type of residue, the fish canning industry in Galicia stands out. Recently, efforts in waste management have been directed towards its valorization, generating new high-value-added products. Among these products are biopolymers such as polyhydroxyalkanoates (PHA) and triacylglycerides (TAG), which serve, respectively, as alternative bioplastics to those derived from petroleum and as precursors of biofuels. The present work involves the application of the PRETENACC technology in a pilot-scale sequencing batch reactor (SBR) (24 L) that uses residual oil as a carbon source to feed a mixed microbial culture (MMC) enriched from activated sludge from a wastewater treatment plant (WWTP). The system operates in successive 12-hour cycles, enriching the MMC and accumulating biopolymers through a feast/famine regime with decoupled feeding of the carbon source (at the beginning of the cycle) and the nutrients (6 hours later, at the beginning of the famine phase). After two operational runs of 35 and 39 days, respectively, with improvements implemented in the aeration system between them, a biopolymer accumulation close to 40 wt% has been achieved. Nevertheless, the system needs to be optimized if targeted accumulation of polyhydroxyalkanoates (PHA) is intended.
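As a rough illustration of the decoupled feast/famine scheme described above, the sketch below lays out one 12-hour cycle. Only the carbon dosing at the start of the cycle and the nutrient dosing 6 hours later are taken from the abstract; every other phase name and duration is an illustrative assumption, not the actual operating protocol of the pilot reactor.

# Minimal sketch of a 12 h feast/famine SBR cycle with decoupled feeding.
# Carbon feed at t = 0 h and nutrient feed at t = 6 h follow the abstract;
# the remaining phases and durations are placeholders.

from dataclasses import dataclass

@dataclass
class Phase:
    name: str
    start_h: float      # time from the start of the cycle (h)
    duration_h: float

CYCLE = [
    Phase("carbon feed (residual oil)", 0.00, 0.25),   # start of feast
    Phase("feast (aerated reaction)",   0.25, 5.75),
    Phase("nutrient feed (N and P)",    6.00, 0.25),   # start of famine
    Phase("famine (aerated reaction)",  6.25, 5.00),
    Phase("settling",                   11.25, 0.50),
    Phase("effluent withdrawal",        11.75, 0.25),
]

def print_schedule(cycle):
    """Print the timeline and check the phases fill one 12 h cycle."""
    total = 0.0
    for p in cycle:
        print(f"{p.start_h:5.2f} h  {p.name:30s} ({p.duration_h:.2f} h)")
        total += p.duration_h
    assert abs(total - 12.0) < 1e-6, "phases should add up to 12 h"

if __name__ == "__main__":
    print_schedule(CYCLE)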
Direction
VAL DEL RIO, MARIA ANGELES (Tutorships)
MOSQUERA CORRAL, ANUSKA (Co-tutorships)
Pedrouso Fuentes, Alba (Co-tutorships)
Court
VIDAL TATO, MARIA ISABEL (Chairman)
ROMERO CASTRO, NOELIA MARIA (Secretary)
RODRIGUEZ MARTINEZ, HECTOR (Member)
Evaluation of materials for dye removal from textile wastewater
Authorship
S.V.R.
Master in Environmental Engineering (3rd ed)
Defense date
06.25.2024 12:30
Summary
Increasing industrialisation has led to significant water pollution due to the discharge of various pollutants of industrial origin, including synthetic dyes used in the textile industry. One of these environmentally harmful compounds is methylene blue, which is widely used for its intense colour and water solubility, characteristics that are, however, problematic for the natural environment. The removal of methylene blue from wastewater is crucial to protect the environment, ensure public health and promote sustainability. It is therefore vital to develop effective and efficient methods for its removal in order to mitigate its negative effects and promote sustainable management of water resources. Under this premise, in this Master Thesis we have evaluated the effectiveness of different adsorbent materials to remove this compound: compost, algae, Fe(0) particles, biochar (derived from corn and eucalyptus) and pine sawdust. Their adsorption capacity was analysed in batch experiments and in fixed bed columns. The results showed that algae and compost have the best adsorption capacity for methylene blue. Using algae and compost is beneficial because they are abundant and low-cost materials, promote sustainable practices, have high pollutant adsorption capacity and offer additional ecological benefits, such as improved soil quality and carbon dioxide absorption. Moreover, their use fosters a circular economy by reusing bio-waste, which not only reduces wastewater treatment costs, but also minimises the overall environmental impact. These results will allow us to develop a wastewater decontamination process using by-products from other industries, thus promoting more sustainable resource management.
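The batch comparison of adsorbents described above is usually quantified by fitting equilibrium data to an isotherm and comparing the resulting maximum capacities. The short sketch below fits a Langmuir isotherm with SciPy to an invented data set, purely to illustrate how q_max and K_L could be estimated for each material; the numbers are placeholders, not results from this work.

# Illustrative Langmuir isotherm fit for methylene blue adsorption data.
# The equilibrium data are invented placeholders; the fit shows how q_max
# and K_L could be estimated to compare adsorbents (e.g. algae vs. compost).

import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k_l):
    """Langmuir isotherm: qe = q_max * K_L * Ce / (1 + K_L * Ce)."""
    return q_max * k_l * ce / (1.0 + k_l * ce)

# Hypothetical equilibrium concentrations (mg/L) and uptakes (mg/g)
ce = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0])
qe = np.array([12.0, 25.0, 40.0, 62.0, 75.0, 82.0])

(q_max, k_l), _ = curve_fit(langmuir, ce, qe, p0=[80.0, 0.05])
print(f"q_max = {q_max:.1f} mg/g, K_L = {k_l:.3f} L/mg")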
Direction
FIOL LOPEZ, SARAH (Tutorships)
ANTELO MARTINEZ, JUAN (Co-tutorships)
Court
HOSPIDO QUINTANA, ALMUDENA (Chairman)
ROMERO CASTRO, NOELIA MARIA (Secretary)
MAURICIO IGLESIAS, MIGUEL (Member)
Evaluation of the treatment of saline effluents with high organic load using purple phototrophic bacteria in an SBR reactor
Authorship
M.B.F.
Master in Environmental Engineering (3rd ed)
Defense date
03.01.2024 10:00
Summary
Traditionally, wastewater treatment has focused on the removal of nutrients (nitrogen, phosphorus and organic matter) in order to ensure that the water meets certain characteristics that minimize the environmental impact of the discharge. In recent years, however, this approach has been broadened, conceiving treatment plants as another actor in the circular economy from which resources can be recovered. In this sense, purple phototrophic bacteria (hereinafter PPB) have emerged as a promising option in wastewater treatment, and their application is beginning to spread in the treatment of urban effluents. Unlike activated sludge systems, PPB biomass contains a high concentration of single-cell protein. This characteristic opens new opportunities for resource recovery, since PPB biomass can be valorized as feed in livestock and fish farming, in addition to being used as fertilizer. Despite notable advances in the use of PPB in wastewater, hardly any studies have addressed the use of PPB-enriched cultures for the treatment of saline effluents. The present work aims to evaluate the effect of salinity on the behavior of a mixed PPB culture.
Direction
MOSQUERA CORRAL, ANUSKA (Tutorships)
VAL DEL RIO, MARIA ANGELES (Co-tutorships)
Pedrouso Fuentes, Alba (Co-tutorships)
Court
PARAJO MONTES, MARIA MERCEDES (Chairman)
Balboa Méndez, Sabela (Secretary)
MAURICIO IGLESIAS, MIGUEL (Member)
Enrichment and accumulation in a single unit to obtain biopolymers with oleic acid as substrate
Authorship
S.M.R.
Master in Environmental Engineering (3rd ed)
Defense date
06.21.2024 09:30
Summary
The management of fatty waste is becoming increasingly significant nowadays. The problems it causes in wastewater treatment plants, along with its polluting potential, make it a target of water treatment research. New strategies are being developed for its use in the synthesis of high-value-added intracellular compounds, in particular polyhydroxyalkanoates (PHA), which can be used for the production of bioplastics with better biodegradability and environmental biocompatibility properties. Their synthesis from waste streams using mixed microbial cultures (MMC) and their biodegradable nature make them strong candidates to replace conventional petroleum-derived plastics. In this master's thesis, oleic acid was used as a substrate to determine the ideal operating conditions for enriching an MMC capable of solubilizing lipid streams and accumulating high-value-added biopolymers. For this purpose, a 4 L Sequencing Batch Reactor (SBR) was operated for 123 days, adjusting various operating conditions (pH, nitrogen concentration, and the method of adding the carbon source). Additionally, two batch accumulation tests were performed with the enriched MMC to evaluate the maximum accumulation capacity of this biomass with oleic acid.
Direction
MOSQUERA CORRAL, ANUSKA (Tutorships)
VAL DEL RIO, MARIA ANGELES (Co-tutorships)
Pedrouso Fuentes, Alba (Co-tutorships)
Court
PRIETO LAMAS, BEATRIZ LORETO (Chairman)
BELLO BUGALLO, PASTORA MARIA (Secretary)
PARADELO NUÑEZ, REMIGIO (Member)
Incidence of (Micro)Plastics in Coastal Habitats of the Sálvora Archipelago (Atlantic Islands National Park)
Authorship
S.P.C.
Master in Environmental Engineering (3rd ed)
Defense date
03.01.2024 11:00
Summary
The oceans accumulate 150 million tons of plastic and it is estimated that they receive between 4.8 and 12.7 million tons more per year. Despite this, its impact on the marine environment has yet to be fully assessed. This work aims to evaluate the incidence of marine litter and microplastics in three coastal habitats (beach, dunes and rocky coast) on the island of Sálvora, in the Atlantic Islands of Galicia National Park. The Sálvora archipelago is located at the mouth of the Arousa estuary, which concentrates most of the mussel production in Galicia, an activity that can contribute substantially to the generation of marine litter. In this work, an inventory of marine litter was carried out along two coastal transects, and the microplastic content was analyzed in the sands of Boi beach and in a dune system that hosts one of the island's main colonies of yellow-legged gull (Larus michahellis). The plastics were identified by Fourier-transform infrared (FTIR) spectroscopy. For the study of microplastics, sand samples were taken and the microplastics were separated by density using a zinc chloride (ZnCl2) solution. The microparticles obtained were analyzed by Raman spectroscopy and micro-ATR FTIR (micro Attenuated Total Reflection Fourier Transform InfraRed) to determine their composition, as was done for the macroplastics. The rocky coast presented the greatest accumulation of litter, representing a mass 10 times greater than that of the beach. The origin of these plastics was related to the sectors that predominate in the area, such as fishing. The analysis of the data made it possible to estimate the total marine litter present in the Atlantic Islands National Park at almost 4 tons. Regarding microplastics, a higher concentration was observed in the dune area with seagull colonies (2247 MPs/kg) than on the beach (480 MPs/kg). The main forms observed were cellulosic fibers and PET and PVC pellets. In the beach area, fibers were abundant, which could be caused by laundry discharges, while in the dune area pellets predominated, which would be related to accumulation through bird feces. It is estimated that 1800 x 10^6 MPs accumulate on the 5100 m2 beach surface, and 1300 x 10^6 MPs in the 2000 m2 seagull colony.
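The habitat-level stock estimates quoted above follow from scaling a measured concentration per kilogram of sand up to the sampled area. A minimal sketch of that arithmetic is given below; the sampling depth and the sand bulk density are assumptions chosen only for illustration, not the values actually used in the study, so the totals will not exactly match those reported.

# Back-of-the-envelope scaling of microplastic concentrations (MPs/kg sand)
# to a total stock per habitat. Concentrations and areas come from the
# abstract; sampling depth and bulk density are illustrative assumptions.

DEPTH_M = 0.05          # assumed sampled sand depth (m)
BULK_DENSITY = 1600.0   # assumed dry sand bulk density (kg/m3)

def total_mps(conc_per_kg, area_m2, depth_m=DEPTH_M, rho=BULK_DENSITY):
    """Total microplastic items = concentration x mass of the sand layer."""
    sand_mass_kg = area_m2 * depth_m * rho
    return conc_per_kg * sand_mass_kg

if __name__ == "__main__":
    print(f"Beach (480 MPs/kg, 5100 m2): {total_mps(480, 5100):.2e} MPs")
    print(f"Dune (2247 MPs/kg, 2000 m2): {total_mps(2247, 2000):.2e} MPs")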
Direction
OTERO PEREZ, XOSE LOIS (Tutorships)
Court
GARCIA-RODEJA GAYOSO, EDUARDO (Chairman)
BARRAL SILVA, MARIA TERESA DEL CARMEN (Secretary)
PARAJO MONTES, MARIA MERCEDES (Member)
Modeling of polyphenol inhibition for its integration into an anaerobic digestion control system
Authorship
M.R.G.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 13:30
Summary
This project, carried out as a Master's Thesis, comprehensively explores the complex interaction between inhibition by phenolic compounds and anaerobic digestion, affecting the microbial process and, consequently, biogas production. A mathematical model incorporating inhibition by polyphenols has been formulated and developed in a patented anaerobic digestion control system from the University of Santiago de Compostela, with exclusive licensing by the company iCODA. This allows for the assessment of anaerobic digestion process performance under variable conditions of phenolic compound presence, with practical applicability and the ability to provide accurate predictions of biogas production. The proposed model has been validated through a batch biogas production test using olive mill wastewater as substrate and literature data, enabling adjustments and calibrations of the kinetic parameters to ensure that simulations accurately reflect the observed behavior in real cases. This project aims to contribute to environmental improvement by offering optimal solutions for waste treatment, positively influencing sustainable development and responsible resource management.
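The abstract does not disclose the structure of the patented model, but inhibition of anaerobic substrate uptake by phenolic compounds is commonly represented by multiplying a Monod-type rate by a non-competitive inhibition factor. The sketch below illustrates that generic textbook formulation; it is not the model developed in this work, and all parameter values are placeholders.

# Generic illustration of polyphenol inhibition coupled to Monod kinetics:
#   rate = k_m * S / (K_S + S) * X * K_I / (K_I + I)
# This is a textbook-style non-competitive inhibition term, not the patented
# model described in the abstract; parameter values are placeholders.

def uptake_rate(S, X, I, k_m=10.0, K_S=0.5, K_I=1.0):
    """Substrate uptake rate (g COD/L/d) under phenolic inhibition.

    S : substrate concentration (g COD/L)
    X : biomass concentration (g COD/L)
    I : total phenolic compound concentration (g/L)
    """
    monod = S / (K_S + S)
    inhibition = K_I / (K_I + I)   # equals 1 when I = 0, tends to 0 as I grows
    return k_m * monod * X * inhibition

if __name__ == "__main__":
    for phenol in (0.0, 0.5, 2.0, 5.0):
        rate = uptake_rate(S=2.0, X=1.0, I=phenol)
        print(f"polyphenols = {phenol:.1f} g/L -> uptake rate = {rate:.2f} g COD/L/d")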
Direction
MAURICIO IGLESIAS, MIGUEL (Tutorships)
RODRIGUEZ VERDE, IVAN (Co-tutorships)
Court
VIDAL TATO, MARIA ISABEL (Chairman)
ROMERO CASTRO, NOELIA MARIA (Secretary)
RODRIGUEZ MARTINEZ, HECTOR (Member)
Integrating Ecosystem Services into LCA of livestock farming: A comparative analysis of beef production systems in Galicia
Authorship
A.F.D.B.
Master in Environmental Engineering (3rd ed)
Defense date
02.27.2024 12:30
Summary
Many livestock systems serve multiple functions by providing ecosystem services (ES) not typically considered in Life Cycle Assessment (LCA) studies. Current LCAs of cattle farming often focus on primary products like beef and milk, neglecting non-provisioning ES. This can lead to a limited understanding of the multifunctional nature of certain management practices. Integrating ES into LCA remains uncommon due to challenges in modelling production systems, gathering inventory data, and interpreting results. This work integrates ES valuations based on Common Agricultural Policy (CAP) payment schemes into LCA mid-point indicators through economic allocation. Recognizing and valuing non-provisioning ES in extensive and mixed systems is crucial in order to acknowledge the labour of pastoralists and small-scale farmers. The results contribute to the discourse on sustainable livestock farming, informing decision-makers about environmental trade-offs in various management approaches.
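Economic allocation, as used here to fold CAP-based ES valuations into the mid-point indicators, simply partitions each farm-level impact in proportion to the economic value of every co-output. The sketch below shows that partitioning step with invented revenue and impact figures; the beef and ES values are placeholders, not the CAP payments or inventory results of the thesis.

# Economic allocation sketch: farm impacts are split across co-outputs
# (beef plus non-provisioning ecosystem services valued via CAP payments)
# in proportion to their economic value. All figures are placeholders.

revenues_eur = {
    "beef (provisioning)": 60_000.0,
    "ecosystem services (CAP-based valuation)": 15_000.0,
}

farm_impacts = {
    "climate change (kg CO2 eq)": 250_000.0,
    "marine eutrophication (kg N eq)": 900.0,
}

total_revenue = sum(revenues_eur.values())

for output, revenue in revenues_eur.items():
    factor = revenue / total_revenue
    print(f"{output}: allocation factor = {factor:.2f}")
    for category, impact in farm_impacts.items():
        print(f"  {category}: {impact * factor:,.0f}")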
Direction
HOSPIDO QUINTANA, ALMUDENA (Tutorships)
DIAZ VARELA, EMILIO RAFAEL (Co-tutorships)
Court
MONTERROSO MARTINEZ, MARIA DEL CARMEN (Chairman)
FERNANDEZ ESCRIBANO, JOSE ANGEL (Secretary)
ALDREY VAZQUEZ, JOSE ANTONIO (Member)
Study of Improvements for the Optimization of the Sludge Line in a Wastewater Treatment Plant
Authorship
A.J.S.C.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 09:30
Summary
This TFM will focus on the study of technologies for improving energy recovery, waste management and nutrient recovery at the Bens WWTP, a supramunicipal facility with the capacity to treat urban wastewater from up to 600,000 population equivalents. The objective would be to make the best use of the resources present in the water to be treated, seeking to maximize energy recovery by minimizing COD oxidation and improving the performance of anaerobic digestion through thermal treatments. Another purpose of the study is to recover the phosphorus removed in the biological reactor by performing controlled crystallization of struvite in the sludge line; this would also reduce plant maintenance by avoiding uncontrolled precipitation in pipes and equipment. Improving waste management is another important aspect of the project, seeking to sanitize the sludge so that it can be applied as fertilizer in accordance with current Galician law. Overall, these measures seek to optimize the operation of the sludge line at the Bens WWTP, aiming for greater energy recovery, a reduction in the volume of waste and recovery of the removed phosphorus for use as fertilizer, thus moving closer to a circular economy model.
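For reference, the controlled struvite crystallization mentioned above relies on the well-established precipitation reaction, written here in simplified ionic form:

Mg^2+ + NH4^+ + PO4^3- + 6 H2O -> MgNH4PO4·6H2O (struvite)

so recovering phosphorus in the sludge line in this way also removes ammonium, and it requires a magnesium source and a slightly alkaline pH to proceed.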
Direction
GARRIDO FERNANDEZ, JUAN MANUEL (Tutorships)
Lamora Suárez, Carlos (Co-tutorships)
Court
MOSQUERA CORRAL, ANUSKA (Chairman)
RODRIGUEZ FERNANDEZ, MANUEL DAMASO (Secretary)
RODIL RODRIGUEZ, EVA (Member)
Evaluation of the continuous partial nitritation process for efficient mainstream nitrogen removal in a WWTP
Authorship
G.M.A.
Master in Environmental Engineering (3rd ed)
Defense date
03.01.2024 10:30
Summary
The presence of nitrogen in wastewater, if not properly removed, poses a great risk to human health and ecosystems. Conventional nitrogen removal systems are based on the biological processes of nitrification, the sequential oxidation of ammonium to nitrite (nitritation by ammonium-oxidizing bacteria, AOB) and of nitrite to nitrate (nitratation by nitrite-oxidizing bacteria, NOB) under aerobic conditions, and denitrification, the reduction of nitrate or nitrite to nitrogen gas under anoxic conditions. The use of these processes entails high energy expenditure for wastewater treatment plants (WWTPs), making wastewater treatment more expensive and increasing the release of greenhouse gases into the atmosphere. The use of systems based on the biological processes of partial nitritation and anammox (PN/AMX) allows energy consumption to be reduced by 60%. This alternative combines PN, the partial oxidation of ammonium to nitrite (50% oxidation), with the AMX process, in which the remaining ammonium is combined with the nitrite produced to yield nitrogen gas. Although this process has been successfully implemented for the treatment of effluents from anaerobic sludge digesters or industrial wastewater, characterized by their high nitrogen concentration and temperatures in the mesophilic range (30 °C), the stable production of nitrite in PN presents a series of difficulties for the treatment of urban wastewater due to its low ammonium concentration (100 mg N-NH4+/L). In the present study, a continuous PN system has been operated with the objective of evaluating the main critical parameters that affect the PN process when treating water with 100 mg N-NH4+/L. The results obtained demonstrate the possibility of oxidizing 50% of the ammonium to nitrite in a system operated with hydraulic retention times of 3 h. At the same time, the potential of free ammonia (FA) and free nitrous acid (FNA) to act as NOB inhibitors has been confirmed. The results reflect the capacity of NOB to develop mechanisms that allow them to tolerate FA concentrations above the theoretical inhibitory limits (0.1 mg FA/L), while they are more sensitive to FNA, beginning to be inhibited below its theoretical limit (0.022 mg FNA/L), which in turn is related to the concentration of biomass present in the culture. The combination of both parameters has allowed up to almost 90% nitrite accumulation to be obtained, evidencing the potential of these systems for application in the treatment of urban wastewater.
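The FA and FNA levels discussed above are normally not measured directly but calculated from the ammonium or nitrite concentration, pH and temperature. The sketch below uses the Anthonisen-type expressions commonly cited in partial nitritation studies for that calculation; the coefficients are the usual literature form and should be treated as an assumption rather than the expressions used in this thesis.

# Free ammonia (FA) and free nitrous acid (FNA) estimated from bulk
# concentrations, pH and temperature, using the Anthonisen-type expressions
# commonly cited in partial nitritation studies (coefficients are the usual
# literature form, not taken from this thesis; HNO2 molar mass taken as 47).

import math

def free_ammonia(tan_mg_n_per_l, ph, temp_c):
    """FA (mg NH3/L) from total ammonia nitrogen (mg N/L)."""
    kb_over_kw = math.exp(6344.0 / (273.0 + temp_c))
    return (17.0 / 14.0) * tan_mg_n_per_l * 10 ** ph / (kb_over_kw + 10 ** ph)

def free_nitrous_acid(no2_mg_n_per_l, ph, temp_c):
    """FNA (mg HNO2/L) from nitrite nitrogen (mg N/L)."""
    ka = math.exp(-2300.0 / (273.0 + temp_c))
    return (47.0 / 14.0) * no2_mg_n_per_l / (ka * 10 ** ph)

if __name__ == "__main__":
    # Example: 100 mg N-NH4+/L and 50 mg N-NO2-/L at pH 7.5 and 20 C
    print(f"FA  = {free_ammonia(100.0, 7.5, 20.0):.2f} mg NH3/L")
    print(f"FNA = {free_nitrous_acid(50.0, 7.5, 20.0):.4f} mg HNO2/L")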
Direction
MOSQUERA CORRAL, ANUSKA (Tutorships)
VAL DEL RIO, MARIA ANGELES (Co-tutorships)
Pedrouso Fuentes, Alba (Co-tutorships)
Court
PARAJO MONTES, MARIA MERCEDES (Chairman)
BARRAL SILVA, MARIA TERESA DEL CARMEN (Secretary)
MAURICIO IGLESIAS, MIGUEL (Member)
Evaluation of wastewater treatment plants from the perspective of sustainability and circular economy
Authorship
R.P.M.
Master in Environmental Engineering (3rd ed)
Defense date
06.21.2024 10:30
Summary
The aim of this project is to assess the sustainability and circularity potential of three wastewater treatment plants located in Italy, as well as to identify possible alternative scenarios to enhance their environmental and circular performance. Real data from the treatment systems were used, analyzed and processed with the Life Cycle Assessment (LCA) methodology, in addition to the evaluation of indicators from internationally recognized certification schemes and documents. The LCA was conducted in accordance with the ISO 14040 and 14044 standards, using 1 m3 of treated water as the functional unit. A life cycle assessment was carried out for each Wastewater Treatment Plant (WWTP), in which six impact categories were studied using the ReCiPe 2016 assessment method, and the plants were compared with each other. Additionally, a sensitivity analysis was conducted with two scenarios: one assessing the impact of reducing electricity consumption by 20%, and another analyzing the impact of using renewable energy sources. The results indicate that the secondary treatment stage makes the greatest environmental contribution in most of the analyzed impact categories. Furthermore, Canegrate stood out for its low environmental impact, followed by Sesto San Giovanni and, finally, Rozzano. The sensitivity analysis demonstrated that energy efficiency and the use of renewable energies help reduce the environmental impact of the WWTPs. Regarding circularity, 31 indicators were selected and grouped into general categories, including environmental and economic ones. Following the analysis, it was found that, although all three installations have both strengths and weaknesses, the Canegrate WWTP generally exhibits more circular values, followed by Sesto San Giovanni and, finally, Rozzano.
Direction
MOREIRA VILAR, MARIA TERESA (Tutorships)
Arias Calvo, Ana (Co-tutorships)
Court
PRIETO LAMAS, BEATRIZ LORETO (Chairman)
BELLO BUGALLO, PASTORA MARIA (Secretary)
PARADELO NUÑEZ, REMIGIO (Member)
Analysis and optimisation of the energy consumption in an automotive components plant
Authorship
T.A.P.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 10:00
Summary
Energy efficiency in industry aims to reduce energy consumption while maintaining the same activity and productivity. The main objective of this study is the analysis and energy optimisation of a vehicle component production plant in order to reduce its energy consumption. This reduction has both an economic effect, lowering the energy bill, and an environmental one, lowering greenhouse gas emissions. In order to achieve this reduction, a series of optimisation measures are proposed for implementation in selected equipment. The choice of this equipment is based on a global analysis of the plant to identify the equipment with the highest consumption of either natural gas or electricity: the infrared ovens in terms of electricity consumption and the boilers in terms of natural gas consumption. Once the equipment is identified, an energy analysis of each item is carried out in order to obtain a baseline consumption value for both electricity and natural gas. After the analysis, and with the help of checklists, it is verified whether the proposed optimisation measures have been implemented, not only in the ovens and boilers but also in plant automation, lighting, air conditioning and pause mode. After implementing some of the measures, a comparative evaluation is carried out to determine whether they have been effective. For the infrared ovens and the lighting, most of the measures proved effective.
Direction
RODIL RODRIGUEZ, EVA (Tutorships)
Lopez Rodríguez, Anjara (Co-tutorships)
Court
SOTO CAMPOS, ANA MARIA (Chairman)
Balboa Méndez, Sabela (Secretary)
SOUTO GONZALEZ, JOSE ANTONIO (Member)
Effect of salinity and pH on phosphorus recovery by purple phototrophic bacteria
Authorship
M.G.N.
Master in Chemical and Bioprocess Engineering
Defense date
07.09.2024 12:45
Summary
Phosphorus is among the resources present in wastewater whose recovery has aroused the greatest interest, because continuous population growth, the strong demand for food and the massive production of fertilizers threaten to make it a limiting raw material on the planet. Numerous large-scale biological phosphorus removal/recovery processes with PAOs (phosphate-accumulating organisms) have been implemented in wastewater treatment plants (WWTPs). However, these biological reactors require aeration, which significantly raises the operating cost of the plant, and they are sensitive to decreases in the influent organic matter concentration. In addition, the biomass used releases polyphosphate to the medium if exposed to long periods of anaerobiosis. PPB (purple phototrophic bacteria), and in particular some of the PNSB (purple non-sulfur bacteria) type, have attracted interest because they have been shown to be PAOs with a unique photoheterotrophic metabolism that accumulate polyphosphate inside the cells under anaerobic conditions, once the organic carbon source of the medium is depleted. However, research on the behavior of these bacteria is required to gain fundamental knowledge of their operating conditions, since there are no articles analyzing the effect of salinity and pH on phosphorus removal with a mixed PNSB culture; most articles use pure cultures and do not characterize the behavior under anaerobic conditions in depth. In this work, a 2 L anaerobic batch reactor with a light source and acetic acid as the organic carbon source is used so that the bacteria exploit their photoheterotrophic metabolism. The biomass is a mixed culture enriched in PNSB, not adapted to salinity or extreme pH values. First, the effect of salinity was studied over a range of NaCl concentrations of 8-24 g/L, obtaining lower phosphorus removals as the salinity of the medium increased. Next, the effect of pH was studied over a range from 4.5 to 9.0. At acidic pH the bacteria did not grow adequately; however, significant removals were obtained in tests in which the biomass experienced a pH close to 9.0 in the stationary phase of its growth. In general, the assays of this work yielded removals in the order of 0.06-0.09 g P removed per g VSS present, values in the typical range of conventional PAOs. The maximum removal achieved was 0.14 g P per g VSS present.
Direction
MOSQUERA CORRAL, ANUSKA (Tutorships)
VAL DEL RIO, MARIA ANGELES (Co-tutorships)
Pedrouso Fuentes, Alba (Co-tutorships)
Court
CASARES LONG, JUAN JOSE (Chairman)
EIBES GONZALEZ, GEMMA MARIA (Secretary)
FRANCO URIA, MARIA AMAYA (Member)
Volatile fatty acids production from xylan by anaerobic fermentation in mixed culture
Authorship
C.M.F.
Master in Environmental Engineering (3rd ed)
Defense date
03.01.2024 13:45
Summary
Industrial processes are responsible for up to 24% of global greenhouse gas emissions, mainly due to the increasing exploitation of fossil fuels and other non-renewable resources to meet the growing demand for products. Concerns about the potential effects of these emissions on climate change have led to research and development into the use of alternative resources to mitigate their impacts, such as the valorization of organic waste for the production of high-value fuels and chemicals. Anaerobic digestion is a widely studied and used alternative for this purpose, as it has shown great potential for efficient, low-cost energy recovery. However, the main product of anaerobic digestion, methane, is still considered a low-value commodity. Therefore, there has been recent interest in avoiding methanogenesis to obtain higher-value chemicals such as ethanol, butanol, lactate, Volatile Fatty Acids (VFAs), and Medium-Chain Fatty Acids (MCFAs). VFAs and MCFAs are of interest due to their wide range of applications, being required by industries such as food, pharmaceuticals, cosmetics, and chemicals. So far, the use of agro-industrial waste for the production of MCFAs has been limited to wastes with a high ethanol or lactate content, as these compounds are essential for the MCFA production process. A new challenge is to expand the types of waste materials used, incorporating fractions that are abundant in nature, such as lignocellulosic residues, despite limitations such as the difficulty of cellulose hydrolysis and low in situ lactate production. This work starts from a model substrate (xylan), complex enough to consider the hydrolysis stage within the overall process but not as complex as cellulose, since optimizing hydrolysis would be a study in itself. The aim is to optimize the production of MCFAs by considering all stages of the process (hydrolysis, acidogenesis, and chain elongation) through the selection of operating conditions, particularly the Hydraulic Retention Time (HRT), and by comparing two operation modes (continuous and sequential batch). The results indicate that substrate hydrolysis is not a limiting stage; rather, the chain elongation step converting VFAs into MCFAs limits the overall rate. The optimal HRT lies between 2 and 4 days; a decrease below 1 day limits the chain elongation kinetics, strongly compromising MCFA yield. Although selecting HRT as the optimization parameter was successful, as clear trends in caproic acid yield were observed in both reactors, its direct relationship with parameters such as the Organic Loading Rate (OLR), Solids Retention Time (SRT), and biomass activity made result interpretation difficult, especially when trying to attribute observed trends to changes in one parameter or another. Regarding the design of a new experiment, one possible improvement is to fix the HRT at a value between 2 and 4 days and optimize the SRT by controlling reactor biomass through purges. Studying the influence of these parameters independently will require greater effort and therefore more resources, but it would improve understanding.
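Because the abstract notes that HRT is entangled with the OLR and the SRT, it may help to make those couplings explicit. The snippet below shows the standard relations for a completely mixed fermenter without dedicated biomass retention (OLR equal to feed concentration divided by HRT, and SRT equal to HRT when no settling or purge decouples them); the xylan feed concentration is a placeholder, not the value used in the experiments.

# Couplings between HRT, OLR and SRT in a completely mixed fermenter
# without biomass retention:
#   OLR = S_in / HRT   (g COD/L/d)
#   SRT = HRT          (when no settling or purge decouples them)
# The feed concentration below is an assumed placeholder value.

S_IN_G_COD_PER_L = 10.0  # assumed xylan feed concentration (g COD/L)

def olr(hrt_days, s_in=S_IN_G_COD_PER_L):
    """Organic loading rate (g COD/L/d) for a given HRT (d)."""
    return s_in / hrt_days

if __name__ == "__main__":
    for hrt in (0.5, 1.0, 2.0, 4.0):
        print(f"HRT = {hrt:3.1f} d -> OLR = {olr(hrt):4.1f} g COD/L/d (SRT = HRT)")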
Direction
CARBALLA ARCOS, MARTA (Tutorships)
Iglesias Riobó, Juan (Co-tutorships)
Court
GONZALEZ GARCIA, SARA (Coordinator)
FIOL LOPEZ, SARAH (Chairman)
Rojo Alboreca, Alberto (Member)
Optimization of volatile fatty acids production for the synthesis of biopolymers through the valorization of complex organic waste
Authorship
I.A.C.M.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 10:30
Summary
The exponential surge in global population has led to an exponential growth in the demand for resources and in the amount of organic waste produced. One approach to managing waste streams is anaerobic digestion, an adaptable and environmentally friendly method. This process involves four steps: hydrolysis, acidogenesis, acetogenesis and methanogenesis. In line with the directives outlined in the European Union's Waste Framework, a transition from recycling to product recovery is feasible by inhibiting methanogenesis. Volatile Fatty Acids (VFAs) generated during acidogenesis and acetogenesis can serve as a carbon source for the synthesis of polyhydroxyalkanoates (PHAs), biodegradable biopolymers with plastic-like properties and therefore a higher market value. This study assesses the potential of VFA production from lipid streams and examines the composition of the PHAs that could be produced from the obtained VFA mixtures. To this end, three batch trials were conducted in 400 mL bottles with different substrates (glucose, oleic acid and cooking oil). In the initial trial, glucose was used as a synthetic reference substrate, along with residual oil. In this trial, total methanogenesis inhibition occurred, and nearly 50% inhibition was observed in the other stages due to free ammonia. In the second trial, the inoculum was changed and the glucose was completely consumed, but the oil did not fully solubilize, remaining as fat flakes floating on the liquid surface. Nevertheless, for both substrates, all the organic matter in soluble form was present as VFAs, with acetic and butyric acids being the major compounds in the case of glucose, and acetic and propionic acids in the case of cooking oil. The objective of the third trial was to study the effect of hydrolysis using oleic acid, in addition to two oil streams with lower concentrations. Oleic acid also could not be solubilized, and the same fat flakes were observed floating on the surface. One of the lower-concentration oil streams formed VFAs with a higher proportion of acetic acid than in trial 2, followed again by propionic acid. The other stream was below the minimum measurement range of the equipment. In terms of Chemical Oxygen Demand (COD) units, the highest amounts of VFAs produced by the oil streams in trials 1, 2 and 3 were 0.63 g COD/L, 2.34 g COD/L and 0.14 g COD/L, respectively. In trial 3, with oleic acid, 1.38 g COD/L was obtained. An increase in VFA production was achieved by changing the inoculum; however, all three optimization attempts in trial 3 failed. Lipid hydrolysis depends strongly on the available surface area, and the formation of fat flakes on the liquid surface hindered hydrolysis, resulting in a maximum of 23% of the introduced COD being converted to VFAs for oil and 13% for oleic acid. Nevertheless, the specific VFAs obtained in each trial were different, indicating the potential for obtaining materials with different bioplastic properties from residual oil by adapting the operating mode.
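As a clarification of how percentages such as the reported maximum of 23% for oil are typically obtained, the acidification (VFA) yield on a COD basis can be expressed as follows; this is the standard definition, not a formula quoted from the thesis:

```latex
Y_{\mathrm{VFA}}\;(\%) \;=\; \frac{\sum_i \mathrm{COD}_{\mathrm{VFA},i}}{\mathrm{COD}_{\mathrm{fed}}}\times 100
```

where COD_VFA,i is the concentration of each individual VFA expressed in g COD/L and COD_fed is the COD of the substrate introduced, in the same units.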
Direction
MOSQUERA CORRAL, ANUSKA (Tutorships)
VAL DEL RIO, MARIA ANGELES (Co-tutorships)
Pedrouso Fuentes, Alba (Co-tutorships)
Court
SOTO CAMPOS, ANA MARIA (Chairman)
Balboa Méndez, Sabela (Secretary)
SOUTO GONZALEZ, JOSE ANTONIO (Member)
Integrating Streaming Processing in an HPC-Big Data Framework for its application in the field of Bioinformatics.
Authorship
A.A.K.
Máster Universitario en Computación de Altas Prestaciones / High Performance Computing from the University of A Coruña and the University of Santiago de Compostela
Defense date
07.03.2024 11:00
Summary
The main objective of IgnisHPC [1] is to unify, in the same framework, the development, combination and execution of HPC and Big Data applications using different programming languages and programming models. At present, it supports multi-language applications implemented in C/C++, Python, Java and Go. Although IgnisHPC has demonstrated its superiority in terms of performance over other standard frameworks such as Hadoop or Spark when it comes to processing large amounts of data, it needs to perform a prior static distribution of the data and does not support streaming processing. [1] César Piñeiro and Juan C. Pichel. A Unified Framework to Improve the Interoperability between HPC and Big Data Languages and Programming Models. Future Generation Computer Systems, Vol. 134, pages 123-139, 2022.
Direction
PICHEL CAMPOS, JUAN CARLOS (Tutorships)
PIÑEIRO POMAR, CESAR ALFREDO (Co-tutorships)
Court
GARCIA LOUREIRO, ANTONIO JESUS (Chairman)
QUESADA BARRIUSO, PABLO (Secretary)
Andión Fernández, José Manuel (Member)
Comparative Study of Event Streaming Platforms: Apache Kafka, Apache Pulsar, and Redpanda
Authorship
R.A.P.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.16.2024 09:00
Summary
Event streaming systems have emerged as a fundamental piece of modern data architectures, allowing the efficient ingestion of large volumes of data and providing near real-time results. Among the most prominent tools in this field are Apache Kafka, Apache Pulsar, and Redpanda. This work aims to conduct a comprehensive analysis of these platforms, exploring both their strengths and weaknesses through the study of their internal architectures, features, and performance. The aspects to be analyzed for each tool are the following: scalability, delivery guarantees and recovery capacity in case of failure, latency, ease of use, community support, and cost. Finally, compatibility between these technologies will also be evaluated, aiming to identify whether integration between them is possible and whether there are specific limitations or requirements for their interoperability.
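To make concrete the kind of ingestion workload such a comparison typically exercises, the following minimal sketch uses the kafka-python client (an assumption; the thesis does not state which client library is used) to publish a few events and read them back from a topic. An equivalent snippet can target Pulsar through its own client, and Redpanda exposes a Kafka-compatible API, so the same code can usually be pointed at a Redpanda broker.

```python
from kafka import KafkaProducer, KafkaConsumer
import json

# Hypothetical broker address and topic name, for illustration only.
BROKER = "localhost:9092"
TOPIC = "events-demo"

# Producer: serialize dicts as JSON and send a handful of events.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
for i in range(5):
    producer.send(TOPIC, {"event_id": i, "payload": f"message-{i}"})
producer.flush()  # block until all buffered records are delivered

# Consumer: read the events back from the beginning of the topic.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no message arrives for 5 s
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for record in consumer:
    print(record.offset, record.value)
```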
Direction
RIOS VIQUEIRA, JOSE RAMON (Tutorships)
Martínez Álvarez, Rafael (Co-tutorships)
Court
Losada Carril, David Enrique (Chairman)
Blanco Heras, Dora (Secretary)
AMEIJEIRAS ALONSO, JOSE (Member)
Development of active learning algorithms to improve Deepfake detection models
Authorship
A.J.L.L.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 16:30
Summary
In this study, the development of Active Learning algorithms was investigated to improve deep fake detection models. For this purpose, the state of the art in Active Learning techniques and deep fake detection was reviewed, and the tools MMFlow and DVC were used on the Celeb-DF (v2) dataset to perform comparisons and train models adapted to deep fake detection. The results indicated that Active Learning techniques significantly enhance the accuracy and efficiency of deep fake detection models compared to traditional methods. However, it was also observed that computational complexity and training time might increase, although this increase is offset by the improved capability of the models to detect deep fakes in contexts with large volumes of data and variability. This work highlights the feasibility and effectiveness of Active Learning as a promising solution to optimize deep fake detection processes, offering a valuable tool to address security, privacy, and misinformation challenges in the digital age.
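As a hedged illustration of the core active learning idea evaluated in the work (uncertainty sampling; the exact query strategy used in the thesis is not specified here), the following sketch trains a classifier on a small labeled pool and repeatedly queries the most uncertain unlabeled samples for annotation. The dataset, model and annotation budget are placeholders, not the Celeb-DF (v2) pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Placeholder data standing in for frame-level deepfake features.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
rng = np.random.default_rng(0)

labeled = list(rng.choice(len(X), size=50, replace=False))   # initial labeled pool
unlabeled = [i for i in range(len(X)) if i not in set(labeled)]

model = LogisticRegression(max_iter=1000)
for round_ in range(10):                      # annotation budget: 10 rounds x 20 samples
    model.fit(X[labeled], y[labeled])
    probs = model.predict_proba(X[unlabeled])[:, 1]
    uncertainty = np.abs(probs - 0.5)         # least confidence for a binary problem
    query = np.argsort(uncertainty)[:20]      # most uncertain unlabeled samples
    newly = [unlabeled[i] for i in query]
    labeled.extend(newly)                     # simulate the oracle providing labels
    unlabeled = [i for i in unlabeled if i not in set(newly)]

print("final training-set size:", len(labeled))
```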
Direction
MUCIENTES MOLINA, MANUEL FELIPE (Tutorships)
Morera Pérez, Juan José (Co-tutorships)
Court
Argüello Pedreira, Francisco Santiago (Chairman)
LAMA PENIN, MANUEL (Secretary)
FELIX LAMAS, PAULO (Member)
Effect of the In/V molar ratio in In2O3/V2O5 catalysts on their activity and selectivity for the hydrogenation of CO2 to methanol
Authorship
C.P.P.S.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 11:30
Summary
Hydrogenation of CO2 to methanol is a promising alternative to achieve decarbonization and mitigate climate change, so it is important to find stable, active, and selective catalysts. In this work, In2O3/V2O5, In2O3/V2O5/SiO2 and In2O3/V2O5/ZrO2 catalysts are studied to show the effect of the V/In molar ratio and the nature of the support on the interaction between phases, and its influence on the generation of oxygen vacancies, considered the active sites for methanol formation. In2O3/V2O5 catalysts were synthesized by coprecipitation, and In2O3/V2O5, In2O3/ZrO2, In2O3/SiO2, In2O3/V2O5/ZrO2 and In2O3/V2O5/SiO2 by wet impregnation, varying the V/In molar ratio. The materials were characterized by XRD, H2-TPR and thermogravimetry. Activity tests were conducted in a differential bed reactor at an absolute pressure of 8 bar, an H2:CO2 flow ratio of 4 and temperatures between 220 and 280 ºC. XRD characterization showed the existence of the cubic crystalline phase of In2O3, the monoclinic phase of ZrO2 and the orthorhombic phase of V2O5. Using the Scherrer equation, average In2O3 crystal sizes between 18 and 30 nm were obtained, and an increase of the In2O3 crystal size was observed with increasing V/In ratio for the SiO2-supported catalysts. Surface area measurements showed that the zirconium oxide used to synthesize InZr has an area of 61 m2/g, while the ZrO2 used to synthesize 0.33VInZr and VInZr has an area of 4 m2/g, making the activity results for these two catalysts not directly comparable with InZr, although they still allowed the evaluation of the effect of the V/In ratio. H2-TPR and thermogravimetric analysis showed the generation of oxygen vacancies in In2O3 under the studied reaction conditions. Furthermore, a higher V/In ratio led to higher H2 consumption and facilitated the reduction of In2O3. A decrease in the apparent activation energy for methanol formation was observed in catalysts with a higher V/In ratio. Additionally, the yield and selectivity towards methanol were improved by the presence of vanadium in the SiO2- and ZrO2-supported catalysts, as a consequence of more active oxygen vacancies arising from a better interaction between the reducible oxides. Studies like this contribute to the understanding of the surface characteristics of catalysts and their impact on catalytic reactions. Nevertheless, further research is needed to achieve the design of more efficient catalysts.
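For reference, the Scherrer equation used to estimate the average In2O3 crystallite size from XRD line broadening has the standard form below; the specific shape factor and diffraction line used in the thesis are not restated here:

```latex
D \;=\; \frac{K\,\lambda}{\beta\,\cos\theta}
```

where D is the mean crystallite size, K the dimensionless shape factor (commonly taken as about 0.9), λ the X-ray wavelength, β the full width at half maximum of the diffraction peak (in radians), and θ the Bragg angle.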
Direction
RODRIGUEZ MARTINEZ, HECTOR (Tutorships)
Jimenez Concepción, Romel Mario (Co-tutorships)
Court
SOTO CAMPOS, ANA MARIA (Chairman)
Balboa Méndez, Sabela (Secretary)
SOUTO GONZALEZ, JOSE ANTONIO (Member)
Deep Learning Application on Radio Frequency Signals Classification
Authorship
M.A.
Master in artificial intelligence
Defense date
09.12.2024 10:00
Summary
This study introduces a one-dimensional Deep Learning (DL) model for radio frequency (RF) classification, incorporating components such as ResNet, Squeeze-and-Excitation blocks, and Depthwise Separable Convolutions. The model aims to classify RF signals from 18 classes, including single-carrier QAM and PSK families and multi-carrier OFDM classes, under realistic conditions. Torchsig toolkit was used to generate synthetic data, simulating real-world scenarios with various impairments such as additive white Gaussian noise, phase shifts, time shifts, frequency shifts, and multipath fading. These impairments create challenging conditions, making the classification task more robust and realistic. The proposed model's performance was evaluated and compared with other advanced models, such as EfficientNet and XCiT. Our model showed competitive accuracy and efficient inference times, demonstrating its potential despite the constraints of data volume and computational power. The results indicate promising avenues for further refinement. Future work will focus on improving the model's performance through data augmentation and real-world testing, aiming for reliable and timely RF signal classification in practical applications.
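To make the building blocks named above concrete, the sketch below shows, in PyTorch, a 1-D depthwise separable convolution followed by a Squeeze-and-Excitation block of the kind commonly combined with ResNet-style backbones. Layer sizes and the I/Q input format are placeholders, not the exact architecture of the thesis.

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv1d(nn.Module):
    """Depthwise conv (one filter per channel) followed by a 1x1 pointwise conv."""
    def __init__(self, in_ch, out_ch, kernel_size=7):
        super().__init__()
        self.depthwise = nn.Conv1d(in_ch, in_ch, kernel_size,
                                   padding=kernel_size // 2, groups=in_ch)
        self.pointwise = nn.Conv1d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

class SEBlock1d(nn.Module):
    """Squeeze-and-Excitation: global pooling, bottleneck MLP, channel-wise rescaling."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1)
        return x * w  # reweight each channel by its learned importance

# Example: a batch of 8 RF signals stored as 2 channels (I/Q) of 4096 samples each.
x = torch.randn(8, 2, 4096)
block = nn.Sequential(DepthwiseSeparableConv1d(2, 64), nn.ReLU(), SEBlock1d(64))
print(block(x).shape)  # torch.Size([8, 64, 4096])
```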
Direction
López Martínez, Paula (Tutorships)
Court
BUGARIN DIZ, ALBERTO JOSE (Chairman)
MERA PEREZ, DAVID (Secretary)
CARIÑENA AMIGO, MARIA PURIFICACION (Member)
GAN-based data augmentation for the classification of remote sensing multispectral images
Authorship
V.X.B.D.
Master in artificial intelligence
Defense date
02.21.2024 12:00
Summary
Multispectral images frequently suffer from limited labeled data, which constrains classification accuracy. The objective of data augmentation is to improve the performance of machine learning models by artificially increasing the size of the training dataset. This work proposes mDAGAN, a data augmentation technique based on a Generative Adversarial Network (GAN) for the classification of high-resolution multispectral remote sensing images. The proposed technique builds on a previous GAN-based model and is intended to be agnostic to the chosen classification technique. The augmentation capacity of mDAGAN for different classification algorithms has been evaluated over three multispectral images of vegetation, providing increased classification accuracies. It also represents an alternative based on the definition of sample similarity rather than on labels, which makes adaptation to regression problems possible.
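As a purely illustrative sketch of the adversarial training at the core of any GAN-based augmentation scheme (not the mDAGAN architecture itself, which is defined in the thesis), the snippet below alternates discriminator and generator updates on random vectors standing in for flattened multispectral patches.

```python
import torch
import torch.nn as nn

dim = 32  # placeholder size of a flattened multispectral patch
G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, dim))   # generator
D = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, 1))    # discriminator (logit)
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real_data = torch.randn(512, dim)  # stand-in for real training patches

for step in range(200):
    real = real_data[torch.randint(0, 512, (64,))]
    fake = G(torch.randn(64, 16))

    # Discriminator update: push real samples towards 1 and fakes towards 0.
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator update: try to make the discriminator output 1 on fakes.
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Synthetic samples that could augment the labeled training set.
augmented = G(torch.randn(100, 16)).detach()
print(augmented.shape)
```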
Direction
Blanco Heras, Dora (Tutorships)
Argüello Pedreira, Francisco Santiago (Co-tutorships)
GOLDAR DIESTE, ALVARO (Co-tutorships)
Court
Barro Ameneiro, Senén (Chairman)
ALONSO MORAL, JOSE MARIA (Secretary)
Cotos Yáñez, José Manuel (Member)
Rule extraction for process mining based on machine learning techniques
Authorship
T.B.A.
Master in artificial intelligence
Defense date
02.21.2024 12:30
Summary
Process mining is a discipline that has been gaining importance by offering a set of techniques that allow knowledge to be extracted from the event logs in which the information generated during the execution of processes is stored. One of the main objectives in process mining is to understand what has happened during the execution of a process. Typically, this goal is achieved by manually exploring the discovered model that describes the behaviour of the process, together with temporal and frequency analytics on its variants and business indicators. In this work, an innovative approach based on decision trees is presented that allows the automatic classification of certain behaviours that occur during a process, based on the information generated during its executions and the variables associated with them, so that process stakeholders can better understand what is going on and thus improve decision making. This technique has been validated on a medical process, the Aortic Stenosis Integrated Care Process (AS ICP), implemented in the Cardiology Department of the University Hospital of Santiago de Compostela. In this process, patient waiting times have been analysed in order to extract the patient profiles susceptible to delays or suitable for prioritisation.
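As a hedged sketch of the rule-extraction idea (the thesis' own feature set and target definition are not reproduced here), the snippet below fits a decision tree on placeholder case-level variables and prints the tree as readable rules using scikit-learn:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 500

# Placeholder case attributes: age, urgent-referral flag, number of prior visits.
X = np.column_stack([
    rng.integers(40, 95, n),        # age
    rng.integers(0, 2, n),          # urgent referral (0/1)
    rng.integers(0, 10, n),         # previous consultations
])
# Placeholder target: whether the case exceeded the waiting-time threshold.
y = ((X[:, 0] > 75) & (X[:, 1] == 0)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20, random_state=0)
tree.fit(X, y)

# Each path from root to leaf reads as an interpretable rule over the variables.
print(export_text(tree, feature_names=["age", "urgent", "prev_visits"]))
```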
Direction
LAMA PENIN, MANUEL (Tutorships)
MUCIENTES MOLINA, MANUEL FELIPE (Co-tutorships)
Court
Barro Ameneiro, Senén (Chairman)
ALONSO MORAL, JOSE MARIA (Secretary)
Cotos Yáñez, José Manuel (Member)
Towards Automated Readability Assessment for Spanish Texts
Authorship
A.B.B.
Master in artificial intelligence
Defense date
02.20.2024 11:30
Summary
This work addresses the automatic readability classification of Spanish texts, focusing on classification models based on the Common European Framework of Reference for Languages (CEFR), specifically the A1, A2 and B1 readability levels. The methodology approached the problem by combining feature extraction for Machine Learning models with the fine-tuning of Transformers for text classification, comparing both on the same corpus. Although the results are in line with the state of the art for other languages, they call for further research, as there is still room for improvement. The best model is a decision tree with an accuracy of 0.49 and a kappa of 0.24, using the set of features with the highest correlation (evaluated with the Spearman rank correlation coefficient) with the target variable, the CEFR level. Future work will explore further approaches to improve performance, a web-based system for classifying readability in Spanish, and an API for automated feature extraction, which would be a valuable resource for NLP research and language learning.
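The feature-selection step described above (ranking features by their Spearman correlation with the CEFR level) can be sketched as follows; the features and data are placeholders, not the corpus used in the thesis:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 300

# Placeholder readability features and an ordinal CEFR label (0=A1, 1=A2, 2=B1).
features = {
    "avg_sentence_length": rng.normal(15, 5, n),
    "avg_word_length": rng.normal(5, 1, n),
    "type_token_ratio": rng.uniform(0.3, 0.8, n),
}
cefr_level = rng.integers(0, 3, n)

# Rank features by the absolute Spearman correlation with the target level.
# spearmanr(...)[0] is the correlation coefficient.
ranking = sorted(
    ((name, abs(spearmanr(values, cefr_level)[0])) for name, values in features.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, rho in ranking:
    print(f"{name}: |rho| = {rho:.3f}")
```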
Direction
CATALA BOLOS, ALEJANDRO (Tutorships)
GARCIA GONZALEZ, MARCOS (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
ORDOÑEZ IGLESIAS, ALVARO (Secretary)
Sánchez Vila, Eduardo Manuel (Member)
Spiking neural networks for multivariate time series prediction
Authorship
A.D.R.M.
Master in artificial intelligence
Defense date
02.21.2024 16:00
Summary
In this work, multiple types of spiking neural networks (SNNs) were designed and developed with snnTorch to test their performance in a predictive maintenance problem, specifically with the FD001 and FD004 subsets of the C-MAPSS Jet Engine Simulated Dataset. Classification, regression and a hybrid model (classifier and regressor) were designed and evaluated. Leaky Integrate-and-Fire (LIF), Recurrent LIF (RLIF) and Spiking LSTM (SLSTM) neurons were tested, and a grid search strategy was applied to the SNN models. To compare with the performance of traditional models, equivalent traditional architectures (Artificial Neural Networks (ANN), Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM)) were designed for the best SNN architectures. SNNs showed great potential due to their observed performance and efficient use of time resources. SLSTM neurons showed superior performance on subset FD001 for both the classification and regression problems, while LIF neurons exhibited better performance on the more complex subset FD004. Specific hyperparameters, such as delta modulation encoding, did not show significant improvements. Hybrid models did not outperform regression models in either subset, with the best regressor featuring early Remaining Useful Life (RUL) preprocessing and the largest window size. Comparisons between traditional models (ANN, RNN, LSTM) and spiking models (LIF, RLIF, SLSTM) highlighted significant performance differences among equivalent architectures. Regarding training time, the traditional equivalent models generally needed less time to be trained for the same number of epochs. Spiking and traditional equivalent models have similar sizes for the recurrent and LSTM model types, while the size difference between the LIF and ANN models was significant, the former being smaller than the latter.
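As a minimal, hedged illustration of how an LIF layer is driven in snnTorch (placeholder sizes, not the networks of the thesis), the sketch below feeds a random multivariate time window through a linear layer followed by a Leaky neuron and accumulates the output spikes over time:

```python
import torch
import torch.nn as nn
import snntorch as snn

num_inputs, num_hidden, num_steps = 14, 32, 30   # e.g. 14 sensors, 30 time steps

fc = nn.Linear(num_inputs, num_hidden)
lif = snn.Leaky(beta=0.9)        # leaky integrate-and-fire neuron with membrane decay beta

x = torch.randn(num_steps, 1, num_inputs)        # (time, batch, features) input window
mem = lif.init_leaky()                           # initialize the membrane potential
spk_rec = []

for t in range(num_steps):
    cur = fc(x[t])                               # input current at time step t
    spk, mem = lif(cur, mem)                     # emit spikes and update membrane state
    spk_rec.append(spk)

spikes = torch.stack(spk_rec)                    # (time, batch, hidden) spike trains
print(spikes.sum().item(), "spikes emitted over the window")
```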
Direction
MERA PEREZ, DAVID (Tutorships)
Fernández Castro, Bruno (Co-tutorships)
Court
CARREIRA NOUCHE, MARIA JOSE (Chairman)
CATALA BOLOS, ALEJANDRO (Secretary)
López Martínez, Paula (Member)
The Chestnut in the Fragas do Eume
Authorship
I.M.B.R.
Master in Environmental Engineering (3rd ed)
Defense date
03.01.2024 13:00
Summary
The chestnut (Castanea sativa) population located in the Fragas do Eume Natural Park (PNFE) is declining. One of the objectives of this research is to find the most favorable planting areas for this species, in an attempt to increase the number of individuals, by using QuantumGIS to carry out several analyses of the PNFE territory. In addition, taking into consideration the current situation in terms of climate change, the quantification of the tons of CO2 fixed by the vegetation is used in order to showcase the trees' role in carbon sequestration, among other benefits.
Direction
Rojo Alboreca, Alberto (Tutorships)
Franco Aradas, Carlos (Co-tutorships)
Court
FIOL LOPEZ, SARAH (Chairman)
GONZALEZ GARCIA, SARA (Secretary)
CARBALLA ARCOS, MARTA (Member)
Effect of hydraulic retention time on the microbiome in dark fermentation processes
Authorship
C.C.G.
Master in Environmental Engineering (3rd ed)
Defense date
09.09.2024 13:00
Summary
Dark fermentation is defined as the set of metabolic processes that take place during anaerobic digestion, independently of light conditions, and that conclude in the production of biogenic hydrogen. The objective of this Master's Thesis is to study how the different operating conditions tested affect the microbiome of hydrogen digesters, in order to detect potential relationships between microorganism populations and monitorable components. In a first phase, the hydrogenogenic capacity of different substrates (urban and industrial waste) was tested in batch mode, in order to select the most favorable one and proceed to scale-up. Subsequently, in a second phase, it was studied how variations in the hydraulic retention time (HRT) affect the microbiome of continuous reactors. For this purpose, biohydrogen production yields, Volatile Fatty Acids (VFAs), nutrients and the organic load of the reactors (among other monitoring parameters) were tracked. The analysis of bacterial populations was performed by DNA metabarcoding. Reactor operation results showed unstable hydrogen production. The microbial diversity results showed a high degree of specialization of the microbial community in two major phyla (Actinobacteriota and Firmicutes). In addition, the presence and relative abundance of LAB (Lactic Acid Bacteria) were detected, although this presence could not be related to the cessation of hydrogen production.
Direction
Balboa Méndez, Sabela (Tutorships)
González Míguez, Alicia (Co-tutorships)
Court
MOSQUERA CORRAL, ANUSKA (Chairman)
MONTERROSO MARTINEZ, MARIA DEL CARMEN (Secretary)
BARCIELA ALONSO, Ma CARMEN (Member)
Assessment of the degree of sustainability and circularity of an industrial wastewater treatment plant
Authorship
B.C.P.
Master in Environmental Engineering (3rd ed)
Defense date
06.21.2024 10:00
Summary
This research work focuses on the exhaustive analysis of the environmental impact generated by an industrial wastewater treatment plant (WWTP), specifically one serving a tuna canning industry. A simulation tool (BioWin) is used to evaluate the behaviour of the plant by making different changes that can reduce the environmental impact while ensuring that the discharge limits are still complied with. The most relevant impact categories were identified, such as global warming and resource scarcity. It was observed that energy consumption and reagent consumption generate the greatest environmental impact, so their critical areas were thoroughly investigated, showing that the aerator of the Sequencing Batch Reactor consumes the most energy and that polyaluminium chloride is the reagent with the greatest environmental impact. Consequently, strategies to improve the efficiency and reduce the environmental impact of the treatment system were discussed, such as optimising the use of reagents and evaluating alternative energy sources. The results obtained underline the complexity of environmental management in wastewater treatment and highlight the need for innovative approaches to face these challenges, providing a global vision that can guide future research and practice in the field of wastewater treatment.
Direction
MOREIRA VILAR, MARIA TERESA (Tutorships)
Arias Calvo, Ana (Co-tutorships)
Court
PRIETO LAMAS, BEATRIZ LORETO (Chairman)
BELLO BUGALLO, PASTORA MARIA (Secretary)
PARADELO NUÑEZ, REMIGIO (Member)
Eating with awareness: the Atlantic diet and its contribution to a water-respectful society
Authorship
E.P.R.
Master in Environmental Engineering (3rd ed)
Defense date
09.09.2024 10:30
Summary
Contemporary food systems have advanced technologically to meet the needs of a growing population, but they face significant sustainability challenges. These systems, which encompass the production and supply of food, have a negative impact on the environment due to the excessive use of natural resources such as water, land and energy, causing issues like resource depletion, greenhouse gas emissions and loss of biodiversity. Agriculture is responsible for 70% of water withdrawals, and to carry out proper water management it is essential to know in detail how much water is needed to produce a product. In particular, the production of meat and dairy products has a high environmental burden, and their demand is expected to increase by 70% by 2050. Therefore, the study of diets, such as the Mediterranean or the new Atlantic diet, is necessary for a healthy and environmentally sustainable lifestyle. This work focuses on creating a database documenting the water footprint of selected foodstuffs, with the aim of providing information for the creation of diets that are both healthy and sustainable, especially with regard to water resources, and on using it in a case study focused on creating a weekly menu, from Monday to Friday, based on the Atlantic diet. The database has been built following the methodology of the 'Water Footprint Assessment Manual', a guide developed by the Water Footprint Network that details the method for assessing and managing the water footprint of products, companies, and countries. As a conclusion, the water footprint of a weekly menu based on the Atlantic diet is equivalent to 12,766.5 liters of water per week (from Monday to Friday) per person, including the three components of the water footprint (green, blue, and gray), which results in 2,553.3 liters per person per day. The largest contribution to the water footprint of the menu comes from the green water footprint (approximately 57%), with products such as milk (75.35%), yogurt (77.9%) and tenderloin (77.82%) standing out.
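As a quick check of the reported figures, dividing the weekly footprint over the five days considered reproduces the daily value:

```latex
\frac{12{,}766.5\ \mathrm{L\,person^{-1}\,week^{-1}}}{5\ \mathrm{days\,week^{-1}}} \;=\; 2{,}553.3\ \mathrm{L\,person^{-1}\,day^{-1}}
```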
Direction
GONZALEZ GARCIA, SARA (Tutorships)
Court
MOREIRA VILAR, MARIA TERESA (Chairman)
Rojo Alboreca, Alberto (Secretary)
FIOL LOPEZ, SARAH (Member)
Modelling, simulation and optimization of a small-scale alkaline electrolyzer in Aspen Plus
Authorship
G.M.L.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 12:30
Summary
Green hydrogen stands out as one of the most promising energy carriers today, offering the potential to reduce greenhouse gas emissions. Among the most promising hydrogen production technologies, electrolysis takes a prominent position; in particular, alkaline electrolysis stands out as the most mature technology, with high efficiency. To facilitate the implementation of this equipment in current industrial applications, a model of the electrolysis process was developed. Previous models found in the literature, together with several original contributions that enhance the model, were taken into account in its development. The model allows the calculation of the hydrogen and oxygen output streams of an electrolyzer regardless of operating conditions and equipment size, relying on electrochemical and thermodynamic fundamentals. Building upon this model, a program was created to simulate an electrolyzer within the Aspen Plus environment. The model's responses to variations in different input variables were analyzed, revealing that the key factors affecting performance are current density, operating temperature, and electrolyte composition; additionally, the electrode area proves to be significantly relevant for certain responses. Using this model, an electrolyzer was optimized to produce 0.8 kg/h of H2, defining the nominal operating conditions and the main equipment dimensions.
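One of the electrochemical fundamentals such a model rests on is Faraday's law, which links the hydrogen production rate to the stack current; it is written here in its standard form, without reproducing the thesis' exact formulation:

```latex
\dot{m}_{\mathrm{H_2}} \;=\; \eta_F\,\frac{N_{\mathrm{cells}}\, I\, M_{\mathrm{H_2}}}{z\,F}
```

where the hydrogen mass flow depends on the Faradaic efficiency η_F, the number of cells in series N_cells, the cell current I, the molar mass of hydrogen M_H2 (about 2.016 g/mol), the number of electrons exchanged per H2 molecule z = 2, and the Faraday constant F (about 96485 C/mol).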
Direction
ROCA BORDELLO, ENRIQUE (Tutorships)
FERNANDEZ CARRASCO, EUGENIO (Co-tutorships)
Blanco López, Alexander (Co-tutorships)
Court
VIDAL TATO, MARIA ISABEL (Chairman)
ROMERO CASTRO, NOELIA MARIA (Secretary)
RODRIGUEZ MARTINEZ, HECTOR (Member)
Basketball score prediction by analyzing key variables in basket shots to evaluate the performance of a player/team.
Authorship
O.F.L.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 16:00
Summary
This work presents a new module intended to enrich the application used by Monbus Obradoiro (Santiago de Compostela basketball team and participant in the 2023/2024 season of the Liga Endesa ACB) to analyze the performance of its players and the rival teams. The project was developed in collaboration with the mentioned basketball team and the company DXC Technology. The proposed module consists of a points prediction model for a game based on the characteristics of the shots taken during the match, such that the main application of the model is to compare the actual performance with the expected performance. The project takes as inspiration and reference the expected goals (xG) metric used in soccer, and it is an approximation of extrapolating this metric to another sport (basketball).
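As a hedged sketch of how an expected-points model of this kind can be built (features, data and model choice are placeholders; the thesis does not necessarily use logistic regression), each shot's make probability is estimated from its characteristics and multiplied by its point value, then summed over the game:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Placeholder shot features: distance to basket (m), defender distance (m), 3-point flag.
X = np.column_stack([
    rng.uniform(0, 8, n),
    rng.uniform(0, 3, n),
    rng.integers(0, 2, n),
])
made = (rng.uniform(0, 1, n) < 0.65 - 0.05 * X[:, 0]).astype(int)  # synthetic shot outcomes

model = LogisticRegression().fit(X, made)

# Expected points for a hypothetical game: sum over shots of P(make) * shot value.
game_shots = X[:40]
shot_value = np.where(game_shots[:, 2] == 1, 3, 2)
expected_points = float(np.sum(model.predict_proba(game_shots)[:, 1] * shot_value))
actual_points = float(np.sum(made[:40] * shot_value))
print(f"expected: {expected_points:.1f}  actual: {actual_points:.1f}")
```

Comparing the two totals is the intended use of the module: a team or player scoring well above the expected value is over-performing its shot quality, and vice versa.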
Direction
Losada Carril, David Enrique (Tutorships)
Sanz Anchelergues, Adolfo (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
López Martínez, Paula (Secretary)
RIOS VIQUEIRA, JOSE RAMON (Member)
Microplastic behaviour in the trophic chain: A contribution to its management through modelling
Authorship
A.P.O.
Master in Environmental Engineering (2nd ed)
Defense date
03.01.2024 12:30
Summary
The presence of plastics in the environment and their adverse effects on biological life have long been the focus of many environmental research studies, so much so that there has been an exponential increase in work in this area over the last two decades. Most of these studies centre on the oceans, seas and rivers as the hot spots of plastic pollution, leaving out many transition zones such as estuaries, lakes and basins, which are also hot spots of plastic pollution. Once in the environment, plastics go through many processes such as photo-oxidation and wind abrasion and, also due to prolonged usage, end up fragmenting into tiny pieces that ultimately reach sizes of less than 5 mm, becoming microplastics. This research study focuses on an estuary, namely the Ría de Vigo, and, more importantly, on the adverse effects of microplastic pollution. The study analyses the toxic effects microplastics pose in the trophic chain, especially among aquatic species, and also confirms the trophic transfer of microplastics among aquatic species. A model has been constructed with which the microplastic concentration in three different matrices (sediments, surface water and mussels) can be estimated. The model also serves as an important tool for predicting the microplastic concentration in the three matrices in the medium and long term, as well as being useful in the formulation of environmental policies.
The presence of plastics in the environment and its adverse effects on biological life has long been the focus of many environmental research studies so much so that there has been an exponential increase in this area in the last two decades. Most of these studies centre on the oceans, seas and rivers as the hot spot of plastic pollution leaving out many transition zones like the estuaries, lakes, basins, etc which are also hot spots of plastic pollution. Once in the environment, plastics pass through many processes like photo-oxidation, wind abrasion, etc, and also due to prolong usage, end up fragmenting into tiny little pieces that ultimately result in sizes that are less than 5mm, becoming microplastics. In this research study, the focus has been made on an estuary, precisely Ria de Vigo, and more importantly, on the adverse effect of microplastic pollution. This study analyses the toxic effects microplastics pose in the trophic chain especially among aquatic species and also confirms microplastics trophic transfer among aquatic species. A model has been constructed with which the microplastic concentration in three different matrixes; in sediments, surface water and in mussels, can be made. The model also serves as an important tool in predicting the microplastic concentration in the three matrixes on a medium and long term basis as well as being useful in the formulation of environmental policies.
Direction
BELLO BUGALLO, PASTORA MARIA (Tutorships)
Court
FIOL LOPEZ, SARAH (Chairman)
GONZALEZ GARCIA, SARA (Secretary)
CARBALLA ARCOS, MARTA (Member)
Evaluation of the SYCLomatic tool
Authorship
T.L.B.
Máster Universitario en Computación de Altas Prestaciones / High Performance Computing from the University of A Coruña and the University of Santiago de Compostela
Defense date
09.03.2024 10:00
Summary
CUDA is a parallel computing platform and programming model exclusive to NVIDIA GPUs. This exclusivity limits the portability of code to other platforms. In contrast, SYCL is a C++-based language that supports different architectures and disjoint memory models, allowing the same code to run on CPUs, GPUs, and FPGAs. The SYCLomatic tool emerges in this context as an open-source solution to ease the transition of CUDA code to SYCL. This simplifies code reuse across multiple platforms, reducing time and costs in ongoing code maintenance. Automatic code migration with SYCLomatic achieves between 90-95% automation, leaving the remaining manual coding to the developers, as well as the performance tuning for a specific architecture. This Master's thesis presents a usability guide for the SYCLomatic tool, in which the degree of automation of the migration is examined and all the necessary manual modifications are indicated. Additionally, a comparative performance study of the migrated and original codes is conducted.
Direction
RODRIGUEZ ALCARAZ, SILVIA (Tutorships)
López Vilariño, David (Co-tutorships)
Court
CABALEIRO DOMINGUEZ, JOSE CARLOS (Chairman)
Blanco Heras, Dora (Secretary)
González Domínguez, Jorge (Member)
Optimization of Extracellular Polymeric Substances (EPS) extraction for their valorization as copper biosorbent
Authorship
C.F.G.A.
Master in Chemical and Bioprocess Engineering
Defense date
07.09.2024 12:00
Summary
Waste management, especially in wastewater treatment, is a growing challenge due to urban development, and the release of heavy metals into aquatic environments caused by industrial activities poses a threat to ecosystems and human health. Extracellular Polymeric Substances (EPS) have unique properties that make them effective adsorbents for heavy metals, making them an attractive alternative in the search for sustainable and economically viable solutions for metal removal. This thesis focused on optimizing a process for the efficient extraction of EPS from sludges of Wastewater Treatment Plants (WWTPs) to maximize the yield and the capacity for copper (Cu2+) adsorption, using experimental design as a statistical technique. The extraction method used was alkaline hydrolysis, varying four factors: reagent concentration, temperature, percentage by mass of sludge in the sludge-reagent mixture, and type of alkaline reagent. Although the optimal conditions were not reached, it was identified that at 20 °C, 10 % sludge, and using 1 M NaOH, the extraction yield and the adsorption capacity of the EPS were maximized (465 mg EPS/g VS and 46.5 mg Cu(II) adsorbed/g EPS, respectively). Additionally, it was determined that reagent concentration and the sludge-to-reagent ratio are the most influential variables in the extraction yield and adsorption capacity. Characterization of the extracts revealed a higher presence of proteins than polysaccharides, especially in trials with high concentrations of alkaline reagents. Contrary to what is published in the literature, increasing the extraction temperature apparently does not increase the amount of EPS; however, what is observed in this study is the result of two effects: the degradation of macromolecules caused by temperature, and the loss of lower-molecular-weight components during the dialysis stages prior to the adsorption tests. Better adsorption capacities were observed using Na2CO3 instead of NaOH as the alkaline reagent, as the lower alkalinity better preserved the structural integrity of the EPS, maintaining the activity of the functional groups on its surface and favoring the adsorption of metal ions. On the other hand, zeta potential assays indicated that electrostatic interaction does not fully explain the behavior of EPS in adsorption processes, so the mechanisms involved require further investigation. The main limitation of this study was the first-order regression model used in the experimental design, which did not allow for complete optimization. It is recommended to carry out an experimental design with a second-order regression model, with the Central Composite Design or the Box-Behnken Design being suitable for this purpose.
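The sketch below illustrates, with assumed coded factors and invented response values, the kind of first-order (main-effects) regression over a two-level factorial design that the abstract identifies as the study's limitation; it is not the thesis' actual data or analysis.

```python
# Minimal sketch of fitting a first-order (main-effects) model to a two-level
# factorial design. Factor coding and the yield values are illustrative only.
import itertools
import numpy as np

# Coded levels (-1 / +1) for: reagent concentration, temperature, % sludge.
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
# Hypothetical EPS yields (mg EPS / g VS) measured at each design point.
yield_eps = np.array([310, 420, 290, 465, 300, 400, 280, 450], dtype=float)

# First-order model: y = b0 + b1*x1 + b2*x2 + b3*x3 (no curvature terms),
# fitted by ordinary least squares.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, yield_eps, rcond=None)
print("intercept and main effects:", np.round(coef, 1))
# A Central Composite or Box-Behnken design would add the quadratic terms
# needed to locate a true optimum, as recommended in the conclusions.
```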
Direction
MOSQUERA CORRAL, ANUSKA (Tutorships)
Marzialetti, Teresita (Co-tutorships)
Court
FEIJOO COSTA, GUMERSINDO (Chairman)
DIAZ JULLIEN, CRISTINA (Secretary)
FERNANDEZ CARRASCO, EUGENIO (Member)
Cross-language emotion recognition from speech
Authorship
M.J.F.
Master in artificial intelligence
Defense date
02.21.2024 13:30
Summary
Speech emotion recognition is an important part of affective computing that focuses on the classification of human emotions from speech audio. Recognizing emotions from speech has multiple applications, such as determining a person’s physical and psychological state of well-being. In this work, we performed cross-language emotion classification on five corpora: the Berlin Database of Emotional Speech, the Ryerson Audio-Visual Database of Emotional Speech and Song, the speech corpus of Quechua Collao for automatic emotion recognition, the EmoFilm multilingual emotional speech corpus, and the Italian emotional speech database Emovo. Pre-processing of the audio files and extraction of features such as Mel spectrograms, log-Mel spectrograms, and Mel-Frequency Cepstral Coefficients are performed. Several proposed models based on state-of-the-art architectures, such as one- and two-dimensional convolutional networks with and without long short-term memory layers, are evaluated, and data augmentation techniques are used to increase the availability of samples for these deep learning architectures. We report the results obtained on the five datasets mentioned above and their cross-language evaluation. Moreover, an analysis of the achieved results is performed to provide comparisons between the Indo-European Romance and Germanic branches and the Andean-Equatorial language families.
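As a rough illustration of the feature-extraction stage named above, the sketch below computes Mel spectrograms, log-Mel spectrograms, and MFCCs with librosa; the audio path is a placeholder and the parameters are common defaults rather than the thesis' exact configuration.

```python
# Minimal sketch of extracting the acoustic features named in the abstract.
# The audio file is hypothetical; parameters are common defaults.
import librosa

path = "speech_sample.wav"                     # hypothetical input file
signal, sr = librosa.load(path, sr=16000)

mel = librosa.feature.melspectrogram(y=signal, sr=sr, n_mels=64)
log_mel = librosa.power_to_db(mel)             # log-Mel spectrogram
mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=40)

# Each matrix is (n_features, n_frames); padding or cropping them to a fixed
# length yields the 2D inputs consumed by CNN / CNN-LSTM classifiers.
print(mel.shape, log_mel.shape, mfcc.shape)
```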
Direction
CONDORI FERNANDEZ, NELLY (Tutorships)
CATALA BOLOS, ALEJANDRO (Co-tutorships)
Court
Barro Ameneiro, Senén (Chairman)
ALONSO MORAL, JOSE MARIA (Secretary)
Cotos Yáñez, José Manuel (Member)
AI-Based Tracking Algorithm Selection for Onboard Detect-and-Avoid Systems
Authorship
J.B.G.T.
Master in artificial intelligence
Defense date
02.21.2024 16:30
Summary
Aircraft detection is a complex problem with a multitude of applications in the field of unmanned aerial vehicles (UAVs), including detect-and-avoid systems. These systems use object trackers to detect surrounding traffic and take appropriate action for safe flight. This work focuses on investigating the state of the art of object tracking algorithms. From this research, SORT, DeepSORT, StrongSORT, ByteTrack, Tracktor and QDTrack are selected for testing both in a laboratory environment, with the MOT Challenge 2020 dataset, and in a field environment, with the Anti-UAV dataset. From these experiments, the best performer was ByteTrack, achieving a MOTA of 93.498, an IDF1 of 89.766 and an HOTA of 77.236 with a YOLOX detector on MOT20, and a MOTA of 71.979, an IDF1 of 65.693 and an HOTA of 54.184 with a RetinaNet detector on Anti-UAV, in addition to a performance of 38.2 FPS.
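For context, the sketch below shows the IoU-based association step shared by the SORT-family trackers evaluated here: detections are matched to existing tracks by maximizing IoU with the Hungarian algorithm. The boxes and threshold are illustrative and not taken from the thesis.

```python
# Minimal sketch of detection-to-track association via IoU + Hungarian matching,
# the step common to SORT-style trackers. Boxes and threshold are illustrative.
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

tracks = np.array([[10, 10, 50, 50], [200, 80, 260, 160]], dtype=float)
detections = np.array([[12, 11, 52, 49], [205, 85, 258, 162], [400, 40, 440, 90]], dtype=float)

cost = np.array([[1.0 - iou(t, d) for d in detections] for t in tracks])
rows, cols = linear_sum_assignment(cost)               # minimize 1 - IoU
matches = [(r, c) for r, c in zip(rows, cols) if 1.0 - cost[r, c] > 0.3]
print("track -> detection matches:", matches)          # unmatched detections start new tracks
```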
Direction
CORES COSTA, DANIEL (Tutorships)
Lorenzo Rodríguez, Alfonso (Co-tutorships)
Court
CARREIRA NOUCHE, MARIA JOSE (Chairman)
CATALA BOLOS, ALEJANDRO (Secretary)
López Martínez, Paula (Member)
Computer vision and AI techniques applied to layer height calculation in top-down bioprinters
Authorship
A.G.S.
Master in artificial intelligence
Defense date
02.21.2024 17:00
Summary
Bioprinting is a novel technique that enables the biofabrication of artificial tissues, through the generation of a support structure in which live cells are trapped at a microscopic level, constituting what is known as a biochip. Biochips can be applied in various fields, ranging from the study of drug diffusion and tumor cells to crafting 3D artificial tissues, and even the creation of complete organs. This work focuses on solving one of the fundamental problems of multi-material DLP bioprinting: determining the gap between the model, printed on a platform, and the surface of the photopolymer resin or biogel, which defines the layer thickness of the printed object. A method based on AI and computer vision techniques is proposed, which is capable of determining the liquid height with a precision of 0.092 mm, and provides platform position measurements with a correlation coefficient of 0.994 with respect to the actual position defined by the stepper motor. This method enables multi-material printing without the need for human intervention whenever a change of material is performed.
Direction
FLORES GONZALEZ, JULIAN CARLOS (Tutorships)
NIETO GARCIA, DANIEL (Co-tutorships)
Court
CARREIRA NOUCHE, MARIA JOSE (Chairman)
CATALA BOLOS, ALEJANDRO (Secretary)
López Martínez, Paula (Member)
Development of a Multi-Document Question-Answering System for the Creation of a FAQ Assistant
Authorship
M.M.H.L.
Master in artificial intelligence
Defense date
02.21.2024 13:00
Summary
This Master's thesis delves into the development of a Multi-Document Question-Answering System for a FAQ Assistant, leveraging advanced techniques in Artificial Intelligence and Natural Language Processing. A key focus of this research is the utilization of semantic search with vector representations of passages prior to the user query, which has emerged as a significant finding in reducing the workload of the reranker while maintaining result accuracy by examining fewer documents in the process. The thesis provides an in-depth exploration of Transformer architecture and semantic search methods, aiming to enhance information retrieval and question-answering capabilities within QA systems. By emphasizing the efficiency gained through semantic search with vector representations, this work sheds light on a novel approach to document ranking and information retrieval, ultimately streamlining the QA process and improving user experience. The study demonstrated a potential reduction in the workload of reranking models by transitioning from analyzing 1000 to 50 documents, delegating a pre-ranking task to a semantically-based model for vectorized documents. This allowed for a decrease in reranker execution time without compromising precision as measured by MRR@5. Finally, the strategies were tested with the complete process of selecting relevant documents using 50, 100, and 500 queries, resulting in a tuple of models: ms-marco-distilbert-base-dot-prod-v3 as the passage search model and ms-marco-MiniLM-L-12-v2 as the reranking model.
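A minimal sketch of such a two-stage retrieve-and-rerank pipeline is shown below, using the sentence-transformers library with a bi-encoder for pre-ranking and a cross-encoder for reranking. The Hugging Face model identifiers are assumed to correspond to the models named in the abstract, and the passages and query are invented.

```python
# Minimal sketch of bi-encoder pre-ranking followed by cross-encoder reranking.
# Model identifiers are assumed hub names; passages and query are invented.
from sentence_transformers import SentenceTransformer, CrossEncoder, util

passages = [
    "Refunds are processed within 14 days of the request.",
    "The office is open Monday to Friday, 9:00 to 17:00.",
    "Password resets can be requested from the login page.",
]
query = "How long does a refund take?"

bi_encoder = SentenceTransformer("sentence-transformers/msmarco-distilbert-base-dot-prod-v3")
doc_emb = bi_encoder.encode(passages, convert_to_tensor=True)
query_emb = bi_encoder.encode(query, convert_to_tensor=True)

# Stage 1: cheap vector search keeps only the top-k passages (50 in the study).
hits = util.semantic_search(query_emb, doc_emb, top_k=2, score_function=util.dot_score)[0]

# Stage 2: the cross-encoder rescores only the shortlisted query-passage pairs.
reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-12-v2")
pairs = [(query, passages[h["corpus_id"]]) for h in hits]
scores = reranker.predict(pairs)
best_idx = int(scores.argmax())
print(pairs[best_idx][1])
```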
Direction
CATALA BOLOS, ALEJANDRO (Tutorships)
CEREZO COSTAS, HÉCTOR (Co-tutorships)
Court
Barro Ameneiro, Senén (Chairman)
ALONSO MORAL, JOSE MARIA (Secretary)
Cotos Yáñez, José Manuel (Member)
Experimentation on multilingual models trained using transformer architectures for natural language processing
Authorship
A.L.V.
Master in artificial intelligence
Defense date
07.19.2024 10:00
Summary
The focus of this Master of Science thesis is to experiment with transfer learning methods on multilingual models that use the transformer architecture and work for the Galician language; mainly the adaptation of these models to downstream natural language tasks such as instruction following. We will use an open-source Galician model, Carballo-1.3B, to perform regular fine-tuning and LoRA methods on it; the end goal being obtaining a working model and comparing these two methods in how they adapt the model to learn the new task while: (i) not losing capabilities on text generation, and (ii) not requiring an unreasonable amount of computational power. To this end, two different human evaluations will be conducted on the trained models, where we judge the model on instruction-following and text generation.
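The sketch below outlines the LoRA route with the peft library: the base causal language model is frozen and only small low-rank adapters are trained. The model identifier is a placeholder rather than the exact hub name of Carballo-1.3B, and the hyperparameters and target modules are illustrative assumptions.

```python
# Minimal sketch of attaching LoRA adapters to a causal LM with peft.
# The model id is a placeholder; hyperparameters and target modules are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base_id = "path/or/hub-id-of-carballo-1.3b"          # placeholder, not the real hub name
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

lora_cfg = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                  # rank of the low-rank update matrices
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections; names depend on the architecture
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()        # only the adapter weights are trainable

# From here the wrapped model is fine-tuned on instruction-following pairs with
# the usual transformers Trainer, then kept as an adapter or merged back.
```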
Direction
Gamallo Otero, Pablo (Tutorships)
BUGARIN DIZ, ALBERTO JOSE (Co-tutorships)
de Dios Flores, Iria (Co-tutorships)
Court
LAMA PENIN, MANUEL (Chairman)
CONDORI FERNANDEZ, NELLY (Secretary)
FELIX LAMAS, PAULO (Member)
Developing an OCR+NER solution to extract information from invoices
Authorship
M.A.M.
Master in artificial intelligence
Defense date
02.20.2024 12:00
Summary
Document understanding, leveraging technologies like Optical Character Recognition (OCR) and Named Entity Recognition (NER), stands as a cornerstone in modern information processing. This thesis project delves into the domain with a specific focus on information extraction from invoices. Arising from practical insights gained during an internship, the project aims to bridge theoretical knowledge with hands-on application and to develop a robust solution capable of efficiently extracting important information from diverse invoice formats. Drawing from the theoretical foundations of document understanding, including the nuances of OCR and NER, we outline a systematic approach encompassing data availability, methodology, and rigorous testing. This thesis project offers a comprehensive exploration of document understanding in the context of invoice processing, providing insights into technology integration, methodology development, and the potential for innovation in real-world scenarios. Through its systematic approach, the project contributes to advancing the field of document understanding and opens avenues for future research and application.
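A minimal sketch of such an OCR + NER chain is given below: pytesseract extracts raw text from an invoice image and a transformers token-classification pipeline tags entities in it. The image path is a placeholder and a generic, publicly available NER model is used for illustration; the actual project would rely on a model adapted to invoice-specific fields.

```python
# Minimal sketch of an OCR + NER chain: OCR the scan, then tag entities in the text.
# The image path is hypothetical and the NER model is a generic public one.
from PIL import Image
import pytesseract
from transformers import pipeline

image_path = "invoice_scan.png"                      # hypothetical input
text = pytesseract.image_to_string(Image.open(image_path))

ner = pipeline("token-classification",
               model="dslim/bert-base-NER",          # generic NER model, for illustration only
               aggregation_strategy="simple")

for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```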
Direction
CATALA BOLOS, ALEJANDRO (Tutorships)
Méndez Llatas, Pablo (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
ORDOÑEZ IGLESIAS, ALVARO (Secretary)
Sánchez Vila, Eduardo Manuel (Member)
Research, development and evaluation of deep learning algorithms for various functionalities in wildfire management support
Authorship
D.S.R.
Master in artificial intelligence
Defense date
02.21.2024 17:30
Summary
Early fire detection is a crucial task in fighting fires and protecting biodiversity, in which drones equipped with vision systems can be a key tool and have proven effective for early detection systems. In this project we present a methodology for evaluating state-of-the-art deep learning models in a two-stage experimental approach. The first experiments are carried out on a public dataset, which is subsequently expanded for the second round of experiments. Four different architectures are trained with different feature extractors, resulting in tests with eleven different models. To this end, a new dataset is also created by taking a public dataset and adding more hand-labeled images, thus gaining variety in the situations and characteristics of the images and bringing them closer to a real scenario for a drone. Finally, evaluation metrics are obtained and inference tests are performed with the chosen models, concluding with the best alternative in the state of the art for this application. We found that the best performer among the models tested was SSD-VGG-16, with a mean Average Precision of 0.76 at a 0.5 IoU threshold (mAP@0.5), although most of the state-of-the-art models performed well in this scenario.
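For illustration, the sketch below runs inference with torchvision's SSD300 with a VGG-16 backbone, the detector family reported as the best performer. The COCO-pretrained weights and the image path are stand-ins; in the thesis the models are trained on the fire/smoke dataset described above.

```python
# Minimal sketch of inference with an SSD + VGG-16 detector from torchvision.
# COCO weights and the image path are stand-ins, not the thesis' trained model.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

model = torchvision.models.detection.ssd300_vgg16(weights="DEFAULT")
model.eval()

img = convert_image_dtype(read_image("drone_frame.jpg"), torch.float)  # hypothetical frame
with torch.no_grad():
    prediction = model([img])[0]

keep = prediction["scores"] > 0.5        # confidence threshold, illustrative
print(prediction["boxes"][keep], prediction["labels"][keep])
```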
Direction
MUCIENTES MOLINA, MANUEL FELIPE (Tutorships)
Abado Bóveda, Silvia (Co-tutorships)
Court
CARREIRA NOUCHE, MARIA JOSE (Chairman)
CATALA BOLOS, ALEJANDRO (Secretary)
López Martínez, Paula (Member)
A novel approach for remaining time prediction in Predictive Process Monitoring
Authorship
N.S.B.
Master in artificial intelligence
Defense date
02.20.2024 12:30
Summary
Transformer-based models have triggered a revolution in the field of Natural Language Processing, and they are now spreading to other fields, such as time-series analysis or process mining. Recently, a new transformer-based approach has been proposed in predictive process monitoring that outperforms all the recurrent-based models that held the best marks until now. In this work, we start from this architecture and propose modifications aimed at improving its results and understanding its functioning. We also experiment with attention to discover to what extent it contributes to the quality of the predictions. Besides, we propose a novel interpretation of traces as time series of events and borrow a new encoding scheme from this field to leverage the temporal information of events. This temporal encoding integrates the time between events into the representation obtained for the trace, similarly to positional encoding but paying attention to the separation between events in time. Results show that different kinds of models perform better depending on intrinsic characteristics of the event log at hand, and that attention may not be as relevant in process mining as it is in language modeling.
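The sketch below illustrates the core idea of such a temporal encoding: the standard sinusoidal Transformer encoding with the elapsed time of each event substituted for its integer position. The exact formulation used in the thesis may differ; this is only an assumption-laden illustration.

```python
# Minimal sketch of a time-aware sinusoidal encoding: elapsed time replaces
# the integer position of the standard Transformer positional encoding.
import torch

def temporal_encoding(elapsed: torch.Tensor, d_model: int) -> torch.Tensor:
    """elapsed: (seq_len,) time since trace start; returns (seq_len, d_model)."""
    i = torch.arange(d_model // 2, dtype=torch.float)
    freqs = torch.exp(-torch.log(torch.tensor(10000.0)) * (2 * i / d_model))
    angles = elapsed.unsqueeze(1) * freqs.unsqueeze(0)      # (seq_len, d_model/2)
    return torch.cat([torch.sin(angles), torch.cos(angles)], dim=1)

# Example: four events with irregular inter-event times (e.g. hours).
elapsed = torch.tensor([0.0, 1.5, 2.0, 26.0])
enc = temporal_encoding(elapsed, d_model=64)
# enc is added to the event embeddings before the Transformer layers, so events
# far apart in time receive distinguishable representations.
print(enc.shape)   # torch.Size([4, 64])
```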
Direction
LAMA PENIN, MANUEL (Tutorships)
VIDAL AGUIAR, JUAN CARLOS (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
ORDOÑEZ IGLESIAS, ALVARO (Secretary)
Sánchez Vila, Eduardo Manuel (Member)
3D Object shape reconstruction combining a single depth image and visual tactile sensors
Authorship
M.B.S.
University Master in Computer Vision
Defense date
02.06.2024 09:30
Summary
Full knowledge of an object’s 3D shape is crucial for robotic manipulation tasks. However, obtaining this information in uncontrolled environments can be challenging. Additionally, robot interaction often obscures vision. High-resolution visuo-tactile sensors can provide dense local information about the object’s shape. Nonetheless, working with visuo-tactile sensors has inherent limitations due to their small sensing area. Therefore, in some cases, the shape representation must be approximated without visual inspection. This work implements a methodology to achieve high-fidelity object shape reconstruction by making extensive use of a single RGB-D image view. The methodology estimates the object’s symmetry plane for reflection and then adds different tactile measurements of the non-visible part using visuo-tactile sensors. The information from the depth camera and the visuo-tactile sensors is processed and combined to obtain the most accurate representation of the object’s shape. The 3D mesh is then reconstructed using traditional methods, such as the ball pivoting and Poisson surface reconstruction algorithms.
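As an illustration of the final reconstruction step, the sketch below applies Open3D's Poisson and ball-pivoting algorithms to a point cloud; a synthetic sphere stands in for the fused depth-camera and tactile points, and the parameters are illustrative.

```python
# Minimal sketch of the surface-reconstruction step with Open3D.
# A synthetic sphere stands in for the fused camera + mirrored + tactile points.
import open3d as o3d

pcd = o3d.geometry.TriangleMesh.create_sphere(radius=0.05).sample_points_poisson_disk(2000)
pcd.estimate_normals()

# Poisson surface reconstruction.
poisson_mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)

# Ball-pivoting reconstruction with a few candidate ball radii (in meters).
radii = o3d.utility.DoubleVector([0.005, 0.01, 0.02])
bpa_mesh = o3d.geometry.TriangleMesh.create_from_point_cloud_ball_pivoting(pcd, radii)

print(len(poisson_mesh.triangles), len(bpa_mesh.triangles))
```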
Direction
CARREIRA NOUCHE, MARIA JOSE (Tutorships)
CORRALES RAMON, JUAN ANTONIO (Co-tutorships)
Court
GARCIA TAHOCES, PABLO (Chairman)
López Martínez, Paula (Secretary)
Pardo López, Xosé Manuel (Member)
Study on the production and modification of cellulose nanoparticles for application in electrochemical cell materials
Authorship
P.J.P.
Master in Environmental Engineering (3rd ed)
Defense date
09.09.2024 12:00
Summary
Energy demand is growing at a dizzying pace and is becoming increasingly difficult to satisfy. Currently, fossil fuels are the major sources of energy supply, but due to their multiple drawbacks and disadvantages, there is growing interest in the search for other energy sources and, in particular, for energy storage materials. These are becoming increasingly relevant in industry, but many of the elements necessary for their development are classified as critical raw materials. Therefore, the search for sustainable materials for the development of electrochemical cells is a topic of great importance in research; lignocellulosic materials emerge as a promising option, especially cellulose, thanks to its highly interesting properties. In the present work, various mechanical and/or chemical pretreatments have been studied to optimize the process of obtaining micro- and nanocellulose through high-pressure homogenization. In addition, esterification and phosphorylation reactions of the nanocellulose have been carried out to functionalize it for its application in batteries.
Direction
PARAJO MONTES, MARIA MERCEDES (Tutorships)
Noguerol Cal, Rosalia (Co-tutorships)
Court
MOSQUERA CORRAL, ANUSKA (Chairman)
MONTERROSO MARTINEZ, MARIA DEL CARMEN (Secretary)
BARCIELA ALONSO, Ma CARMEN (Member)
Addressing Multiple Object Tracking with Segmentation Masks
Authorship
M.B.G.
University Master in Computer Vision
Defense date
02.06.2024 10:05
Summary
Multiple Object Tracking (MOT) aims to locate all the objects from a video, assigning them the same identities across all frames. Traditionally, this problem was addressed following the Tracking by Detection (TbD) paradigm, using detections represented by bounding boxes. However, bounding boxes can contain information from several objects, something that does not happen with segmentation masks. This work takes the ByteTrack MOT system as a starting point. Our proposal, ByteTrackMask, integrates a class-agnostic segmentation method and a segmentation-based tracker in ByteTrack in order to rescue tracks that would have been lost. Results over validation sets of MOT challenge datasets provide improvements in MOT metrics of interest like MOTA, IDF1 and false negatives.
Direction
MUCIENTES MOLINA, MANUEL FELIPE (Tutorships)
BREA SANCHEZ, VICTOR MANUEL (Co-tutorships)
Court
GARCIA TAHOCES, PABLO (Chairman)
López Martínez, Paula (Secretary)
Pardo López, Xosé Manuel (Member)
Identification of environmental benefits linked to the application of vermicompost in albariño grape vineyards
Authorship
P.A.L.
Master in Environmental Engineering (3rd ed)
Defense date
09.09.2024 09:15
Summary
The northwest of Spain, with its distinct edaphic and climatic characteristics, gives rise to unique grape varieties, among which Albariño stands out. The cultivation of vines is recognized as harmful to the environment, mainly due to the high use of agrochemicals, which contributes significant emissions of toxic compounds to the environment. To improve the environmental sustainability of vine cultivation, the objective is to integrate sustainable agricultural practices and, in this sense, the use of biostimulants receives special attention. This work focuses on analyzing the environmental benefits derived from the use of vermicompost (as an alternative to mineral fertilizers) produced from Albariño grape bagasse, within a circular economy perspective in sustainable viticulture, taking as a case study the vineyards of Adega Paco y Lola. The study was carried out using the Life Cycle Assessment (LCA) methodology. The study aims to i) identify the critical points of the cultivation process from an environmental perspective, and ii) determine the environmental benefits of using vermicompost as an organic amendment compared with mineral fertilization. For this purpose, inventory data were collected from the cultivation process and processed to subsequently model the environmental profile based on a selection of impact categories representative of agricultural systems, using the SimaPro software to implement the two inventories.
Direction
GONZALEZ GARCIA, SARA (Tutorships)
Court
OTERO PEREZ, XOSE LOIS (Chairman)
SOUTO GONZALEZ, JOSE ANTONIO (Secretary)
ALDREY VAZQUEZ, JOSE ANTONIO (Member)
Production of PET Hydrolytic Enzymes in a Bioreactor
Authorship
J.L.G.
Master in Chemical and Bioprocess Engineering
Defense date
09.11.2024 13:15
Summary
The accumulation of plastics in the environment is a critical problem due to their long degradation times compared to their useful life. Focusing on the ultimate goal of recycling single-use polymers, specifically PET compounds, this work centers on producing two hydrolytic enzymes for the polymer in a 2 L bioreactor. These enzymes are obtained both in free form and encapsulated in nano-spheres (NS), using the bacterium Escherichia coli BL21 (DE3) as the microorganism hosting the expression plasmids. The aim is to contribute to sustainable processes for large-scale production of these proteins, thereby creating new business opportunities related to enzymatic recycling. The study includes characterizing microbial growth curves (at a small scale), applying different bioreactor configurations (monitoring dissolved oxygen, pH, redox potential, and absorbance), and comparing various cell lysis and enzyme purification techniques. The evaluation considers how these variables influence protein yield. In general, the choice of well type in small-scale assays affects microbial growth, and different bacterial reproduction rates are observed depending on the enzymes being produced. Additionally, within the bioreactor experiments, oxygen supply is a limiting factor that determines the production cycle time. Finally, future experiments are proposed to advance this development, drawing on the conclusions from this work.
Direction
EIBES GONZALEZ, GEMMA MARIA (Tutorships)
Balboa Méndez, Sabela (Co-tutorships)
GALINDO MORALES, SARA (Co-tutorships)
Court
GARRIDO FERNANDEZ, JUAN MANUEL (Chairman)
FRANCO RUIZ, DANIEL JOSE (Secretary)
FREIRE LEIRA, MARIA SONIA (Member)
Environmental assessment of the compliance of new biorefineries with the European Taxonomy Regulation.
Authorship
A.L.V.
Master in Environmental Engineering (2nd ed)
Defense date
09.09.2024 10:00
Summary
This master thesis analyses how biorefineries, particularly carboxylate-based ones, contribute to sustainability by transforming biomass into valuable products such as biomethane and biofertilizers, aligning with the Europe 2020 Strategy and the European Taxonomy Regulation. It investigates the environmental performance indicators of biorefineries compared to other biomass valorisation technologies, assessing their efficiency and sustainability. Additionally, it examines the compliance of these biorefineries with the six environmental objectives of the European Taxonomy, highlighting their role in decarbonization and the circular economy, and promoting sustainable investment and market transparency to achieve the EU's climate goals for 2050.
Direction
MAURICIO IGLESIAS, MIGUEL (Tutorships)
HOSPIDO QUINTANA, ALMUDENA (Co-tutorships)
Court
MOREIRA VILAR, MARIA TERESA (Chairman)
Rojo Alboreca, Alberto (Secretary)
FIOL LOPEZ, SARAH (Member)
Reusing, Recycling and Reducing Pre-trained Models for Developing and Evaluating Green Data-to-text Systems: A Use Case with Meteorological Data
Authorship
A.V.C.
Master in artificial intelligence
Defense date
02.20.2024 13:00
Summary
This Master's Thesis aims to advance the field of Natural Language Processing (NLP) towards enhanced environmental sustainability, with a focus on the reuse, recycling, and reduction of pre-trained language models. Our core objectives include (i) refining existing models for Spanish and Galician languages through efficient methods adaptable to new domains, genres, and languages; (ii) investigating the viability of approaches to adapt language models with a minimized carbon footprint; and (iii) developing a systematic pipeline for experimentation, facilitating the definition of baselines, creation of alternative models, text generation, and comprehensive evaluation, which includes assessing values related to energy consumption and the quality of generated text to determine the optimal model. The central theme underscores environmental responsibility in NLP, as exemplified in a use case involving meteorological data in Spanish and Galician languages.
Direction
ALONSO MORAL, JOSE MARIA (Tutorships)
BUGARIN DIZ, ALBERTO JOSE (Co-tutorships)
GONZALEZ CORBELLE, JAVIER (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
ORDOÑEZ IGLESIAS, ALVARO (Secretary)
Sánchez Vila, Eduardo Manuel (Member)
Semantic Segmentation of Multispectral Remote Sensing Images Using Transformers
Authorship
P.C.G.
Master in artificial intelligence
Defense date
07.19.2024 10:30
Summary
Semantic segmentation for Earth observation is a process that involves classifying each pixel in an image into predefined categories. For remote sensing images used in land cover and land use analysis, segmentation facilitates detailed analysis and understanding by identifying elements such as water bodies, vegetation, urban areas, and roads. Deep learning, particularly transformers, has been transformative in the field of semantic segmentation of remote sensing images, yielding significantly better results than previous techniques due to their high generalization capability and adaptability to different properties of the images. In this work, a semantic segmentation model, Mask2Former, which was developed for semantic segmentation of RGB images, is adapted to work with multispectral images. Mask2Former is based on transformers and performs semantic segmentation by assigning masks to the classes of elements present in the image. The 4-band multispectral images considered are those contained in the Five Billion Pixels dataset, captured by the Gaofen-2 satellite. Models trained with the proposed Mask2Former for multispectral images were compared with models trained with the original Mask2Former using only the RGB bands, as well as with state-of-the-art methods in the literature. The results demonstrated the benefits of using multispectral images in terms of standard metrics such as overall accuracy and IoU.
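The sketch below shows, on a generic RGB backbone, the usual PyTorch pattern for accepting 4-band input: the first convolution is widened to four channels, reusing the pretrained RGB filters and initialising the extra band from their mean. Mask2Former's actual code and the thesis implementation may handle this differently; this only illustrates the principle.

```python
# Minimal sketch of adapting an RGB backbone to 4-band multispectral input.
# A ResNet stands in for the real backbone; the initialisation scheme is one common choice.
import torch
import torch.nn as nn
import torchvision

backbone = torchvision.models.resnet50(weights="DEFAULT")    # RGB backbone stand-in
old_conv = backbone.conv1                                     # Conv2d(3, 64, 7, stride=2, ...)

new_conv = nn.Conv2d(4, old_conv.out_channels,
                     kernel_size=old_conv.kernel_size,
                     stride=old_conv.stride,
                     padding=old_conv.padding,
                     bias=old_conv.bias is not None)
with torch.no_grad():
    new_conv.weight[:, :3] = old_conv.weight                  # keep the pretrained RGB filters
    new_conv.weight[:, 3:] = old_conv.weight.mean(dim=1, keepdim=True)  # init the extra band
backbone.conv1 = new_conv

x = torch.randn(1, 4, 224, 224)     # fake 4-band image (e.g. R, G, B, NIR)
print(backbone(x).shape)
```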
Direction
ORDOÑEZ IGLESIAS, ALVARO (Tutorships)
Argüello Pedreira, Francisco Santiago (Co-tutorships)
Court
LAMA PENIN, MANUEL (Chairman)
CONDORI FERNANDEZ, NELLY (Secretary)
FELIX LAMAS, PAULO (Member)
Native Implementation of OpenMP For Python
Authorship
D.M.P.O.
Master in High Performance Computing from the University of A Coruña and the University of Santiago de Compostela
Defense date
09.03.2024 10:30
Summary
It was previously impossible to obtain a speedup using multithreading in Python due to the Global Interpreter Lock in the main Python implementation, CPython. In October 2024, Python 3.13 will be released, including PEP 703, which makes the Global Interpreter Lock optional in CPython. Until now, the only way to parallelize Python programs has been to use multiple processes and inter-process communication, a context that made implementing the OpenMP API pointless. With the Global Interpreter Lock becoming optional in Python 3.13, providing simple multithreading APIs is both relevant and useful.
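As a minimal illustration of why this matters, the sketch below parallelizes a CPU-bound loop with standard-library threads; it is not the OpenMP-style API developed in the thesis. On a free-threaded (PEP 703) CPython 3.13 build the worker threads can run simultaneously, while under the traditional GIL they execute one at a time.

from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    # CPU-bound work: sum of squares over a half-open range.
    lo, hi = bounds
    total = 0
    for i in range(lo, hi):
        total += i * i
    return total

def parallel_sum_of_squares(n, num_threads=4):
    chunk = n // num_threads
    ranges = [(t * chunk, n if t == num_threads - 1 else (t + 1) * chunk)
              for t in range(num_threads)]
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        return sum(pool.map(partial_sum, ranges))

if __name__ == "__main__":
    print(parallel_sum_of_squares(10_000_000))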
Direction
PICHEL CAMPOS, JUAN CARLOS (Tutorships)
PIÑEIRO POMAR, CESAR ALFREDO (Co-tutorships)
Court
CABALEIRO DOMINGUEZ, JOSE CARLOS (Chairman)
Blanco Heras, Dora (Secretary)
González Domínguez, Jorge (Member)
Automatizing hyperparameter optimization for Machine Learning in PyTorch with evolutionary algorithms
Authorship
R.R.
Master in High Performance Computing from the University of A Coruña and the University of Santiago de Compostela
Defense date
09.03.2024 11:00
Summary
The purpose of this project is to create a Python library that automates the optimization and fine-tuning of hyperparameters for PyTorch ML models using evolutionary algorithms, which implement concepts from natural selection: variable fitness, heritability, crossover, mutation, etc. A population evolves over many generations, where each individual in the population is a set of hyperparameters. Examples of the hyperparameters to be tuned are the learning rate, the batch size, and the number of epochs. For this, I use DEAP (Distributed Evolutionary Algorithms for Python), an open-source Python library that provides the building blocks to customize different implementations of evolutionary algorithms.
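As an illustration of the approach, the following is a minimal sketch using DEAP to evolve (learning rate, batch size, epochs) triples. The evaluate function is a placeholder standing in for an actual PyTorch training run, and the operator choices are illustrative rather than those of the library being developed.

import random
from deap import base, creator, tools, algorithms

# Each individual is a set of hyperparameters: [learning_rate, batch_size, epochs].
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("lr", random.uniform, 1e-4, 1e-1)
toolbox.register("batch", random.choice, [16, 32, 64, 128])
toolbox.register("epochs", random.randint, 5, 50)
toolbox.register("individual", tools.initCycle, creator.Individual,
                 (toolbox.lr, toolbox.batch, toolbox.epochs), n=1)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

def evaluate(ind):
    # Placeholder fitness: in the real library this would train a PyTorch model
    # with these hyperparameters and return its validation accuracy.
    lr, batch, epochs = ind
    return (1.0 / (1.0 + abs(lr - 0.01)) + epochs / 100.0,)

toolbox.register("evaluate", evaluate)
toolbox.register("mate", tools.cxUniform, indpb=0.5)
# Per-gene mutation scales; a real implementation would round/clip the integer genes.
toolbox.register("mutate", tools.mutGaussian, mu=[0.0, 0.0, 0.0], sigma=[0.01, 8.0, 2.0], indpb=0.3)
toolbox.register("select", tools.selTournament, tournsize=3)

pop = toolbox.population(n=20)
pop, _ = algorithms.eaSimple(pop, toolbox, cxpb=0.6, mutpb=0.3, ngen=10, verbose=False)
print("best hyperparameters:", tools.selBest(pop, k=1)[0])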
Direction
QUESADA BARRIUSO, PABLO (Tutorships)
Andrade Canosa, Diego (Co-tutorships)
Court
CABALEIRO DOMINGUEZ, JOSE CARLOS (Chairman)
Blanco Heras, Dora (Secretary)
González Domínguez, Jorge (Member)
Deep learning for computer vision: from cloud to edge.
Authorship
M.E.A.E.F.
University Master in Computer Vision
Defense date
02.06.2024 10:40
Summary
This master’s thesis investigates the use of specific methods to optimize deep learning models across different frameworks and datasets. In PyTorch, three quantization techniques are studied using the MNIST dataset: dynamic post-training quantization (PTDQ), static post-training quantization (PTSQ), and quantization-aware training (QAT). In parallel, various quantization methods were explored in TensorFlow using the CIFAR-10 dataset, including standard quantization, output layer quantization, and input/output layer quantization. Additionally, two knowledge distillation (KD) experiments were performed using TensorFlow: one with the Vision Transformer (ViT) model as the “teacher” and ResNet-50 as the “student”, which showed unfavorable results, and another with ResNet-50 as the “teacher” and MobileNetV2 as the “student”, to evaluate the impact on simpler networks. The purpose of these experiments is to understand how different quantization techniques and knowledge distillation affect the accuracy and efficiency of deep learning models in different scenarios in the field of green computing.
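For orientation, the sketch below shows dynamic post-training quantization (PTDQ), the first of the three PyTorch techniques mentioned. The small fully connected MNIST-style model is an assumption for illustration, not the architecture studied in the thesis.

import torch
import torch.nn as nn

# A small fully connected network of the kind typically trained on MNIST.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
model.eval()

# Dynamic post-training quantization (PTDQ): weights of the listed module types
# are stored as int8 and activations are quantized on the fly at inference time.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 1, 28, 28)
print(quantized(x).shape)  # the quantized model keeps the same interface as the float model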
Direction
BREA SANCHEZ, VICTOR MANUEL (Tutorships)
PARDO SECO, FERNANDO RAFAEL (Co-tutorships)
Court
GARCIA TAHOCES, PABLO (Chairman)
López Martínez, Paula (Secretary)
Pardo López, Xosé Manuel (Member)
Olive oil structuring using ethyl cellulose of different molecular weights
Authorship
C.A.A.G.
Master in Chemical and Bioprocess Engineering
Defense date
07.09.2024 11:30
Summary
In the search for replacements for the saturated fats and trans fatty acids used by the food industry, structuring techniques have been developed for oils rich in unsaturated fatty acids. Some of these techniques are based on gelling the oil by adding a structuring agent through the direct method, which involves raising the temperature to the gelling point of the polymer used. In this study, the effects of mixing ethyl cellulose (EC) of different viscosities (22, 46 and 100 mPa·s) on the color properties, texture (hardness, elasticity, cohesiveness, adhesiveness, chewiness and gumminess), oil binding capacity (OBC), oxidation and rheological parameters of the oleogels were evaluated. For this purpose, simple oleogel systems with each viscosity and at different EC concentrations (8%, 10%, 12% and 15% w/w) were first formulated for olive oil gelation. Subsequently, three different oleogels were prepared from EC mixtures of different viscosities. The mixtures were defined as Mixture T, Mixture A and Mixture B, with ratios of 50:50, 25:75 and 75:25 of the 22 and 100 mPa·s ECs, respectively. The results established that oleogels formulated from mixtures of EC of different molecular weights require a smaller amount of polymer to reproduce the textural and rheological properties of oleogels formulated only with 46 mPa·s EC. Regarding oxidation, increasing the proportion of 22 mPa·s EC in the mixtures studied decreased primary, secondary and total oxidation, which indicates a protective potential of this EC against the oxidation of oleogels.
Direction
MOREIRA MARTINEZ, RAMON FELIPE (Tutorships)
FRANCO RUIZ, DANIEL JOSE (Co-tutorships)
Court
FEIJOO COSTA, GUMERSINDO (Chairman)
DIAZ JULLIEN, CRISTINA (Secretary)
FERNANDEZ CARRASCO, EUGENIO (Member)
Targeted production of carboxylic acids in microbial fermentations: the effect of culture medium
Authorship
M.B.F.
Master in Chemical and Bioprocess Engineering
Defense date
07.09.2024 10:30
Summary
Today's society is highly dependent on petrochemicals and faces the generation of large amounts of waste. In line with the sustainable development goals, this study focuses on the production of volatile fatty acids by anaerobic fermentation, which is attractive for the valorization of organic waste. Specifically, the effect of modifying the culture medium (complex vs. mineral) and the mode of operation (continuous vs. semi-continuous), using glucose as substrate, on the targeted production of volatile fatty acids is investigated. The results show higher production in nutrient-rich media, although with differences in composition with respect to production in the poor medium. Furthermore, hexanoic acid production was achieved much more effectively in a continuous reactor than in a semi-continuous reactor. This highlights the importance of considering both the culture medium and the mode of reactor operation to optimize the production of volatile fatty acids.
Direction
REGUEIRA LOPEZ, ALBERTE (Tutorships)
Cubero Cardoso, Juan (Co-tutorships)
Court
FEIJOO COSTA, GUMERSINDO (Chairman)
DIAZ JULLIEN, CRISTINA (Secretary)
FERNANDEZ CARRASCO, EUGENIO (Member)
Drying kinetics and characteristics of olive oil and chitosan oleogels
Authorship
M.N.S.L.C.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 09:00
Summary
The replacement of unhealthy saturated fats with healthier options is currently one of the main objectives of the food industry. One alternative is to use vegetable oils with good nutritional properties, stabilized in the form of oleogels, as a replacement for saturated and trans fats. In this work, olive oil oleogels are studied using chitosan and vanillin as structuring agents. Olive oil (the main component) is trapped in a three-dimensional network built by means of a Schiff base (imine) formed between the functional groups of chitosan (amines) and vanillin (aldehydes), applying an indirect (emulsion-template) method. This consists of forming oil-in-water emulsions and subsequently removing the water. Two types of emulsions were generated with different concentrations of chitosan (0.7 and 0.8% by weight), maintaining a constant vanillin/chitosan ratio (1.3) and a dispersed phase (oil)/continuous phase (water) mass ratio of 50:50 by weight, which were dehydrated by two techniques: convective drying at four temperatures (50, 60, 70 and 80 °C) and freeze-drying. The experimental results made it possible to determine the drying kinetics of the emulsions at the different temperatures and to model them with empirical and diffusional models. Product texture properties (hardness, adhesiveness, cohesiveness, springiness) and quality properties (oil retention, color and oxidation) were analyzed. The optimum drying condition corresponded to a temperature of 70 °C, since higher air temperatures did not accelerate the drying kinetics. It was concluded that the drying time and the concentration of the gelling agent are fundamental for the quality and texture parameters. Hardness increased with gelling agent concentration and decreased with temperature, while the other properties showed the reverse behavior. Oil retention improves significantly as chitosan concentration increases. On the other hand, high oxidation values were obtained due to the presence of aldehydes in the system, coming from vanillin, which interfere with the adopted methodologies. With respect to the freeze-dried samples, the change in color was smaller and there were no browning reactions, indicating high quality in this respect. In terms of texture, the samples were softer and more elastic than those obtained by conventional drying, with high adhesiveness; in addition, oil retention was lower. In conclusion, conventional drying generates samples with better properties and quality than freeze-drying; however, the important change in color that occurs must be considered, together with an economic analysis of the options.
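To make the modeling step concrete, the short sketch below fits the Page model, one commonly used empirical thin-layer drying model, to a drying curve with SciPy. The moisture-ratio data are synthetic placeholders, not measurements from this work, and the thesis may have used other empirical or diffusional models.

import numpy as np
from scipy.optimize import curve_fit

# Illustrative drying curve: moisture ratio (MR) measured over time at one temperature.
t = np.array([0, 30, 60, 90, 120, 180, 240, 300], dtype=float)   # minutes
mr = np.array([1.00, 0.78, 0.60, 0.47, 0.36, 0.22, 0.13, 0.08])  # dimensionless

def page_model(t, k, n):
    # Page empirical thin-layer drying model: MR = exp(-k * t**n)
    return np.exp(-k * t**n)

(k, n), _ = curve_fit(page_model, t, mr, p0=(0.01, 1.0))
print(f"fitted Page parameters: k = {k:.4f} min^-n, n = {n:.2f}")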
Direction
MOREIRA MARTINEZ, RAMON FELIPE (Tutorships)
FRANCO URIA, MARIA AMAYA (Co-tutorships)
Court
MOSQUERA CORRAL, ANUSKA (Chairman)
RODRIGUEZ FERNANDEZ, MANUEL DAMASO (Secretary)
RODIL RODRIGUEZ, EVA (Member)
Development of a decision support model for group quotation in the hotel industry.
Authorship
A.P.F.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 16:00
Summary
In the hotel sector, a group accommodation booking request can represent a significant opportunity and source of profit for a chain like Eurostars Hotel Company. Unlike an individual booking, a group stay generally receives a price reduction, given the high guarantee of occupancy and the potential for contracting additional services that it provides to the company. The revenue management department is responsible for handling these requests and deciding, taking into account the complex context that surrounds them, whether to accept the request and, if so, at what price. The purpose of this work is to develop a model that supports the different stages of the request management process, leveraging the available internal data records. To this end, in addition to automating a series of manual processes, we will seek to benefit from the Big Data techniques studied in the program. The aforementioned context includes aspects such as the expected demand or the expected room price for a given day, values that will be the target of a predictive model. Finally, we will create a visual tool that allows for easy use of the developed model.
Direction
MUCIENTES MOLINA, MANUEL FELIPE (Tutorships)
Comesaña García, Alejandra (Co-tutorships)
Freire Martínez, Ignacio (Co-tutorships)
Court
Argüello Pedreira, Francisco Santiago (Chairman)
LAMA PENIN, MANUEL (Secretary)
FELIX LAMAS, PAULO (Member)
Study of Eco-Efficiency in Intensive Barley Cultivation in Italy
Authorship
M.D.M.M.
Master in Environmental Engineering (3rd ed)
Defense date
09.09.2024 09:45
Summary
This study evaluates the eco-efficiency of intensive barley cultivation in Italy using Life Cycle Assessment (LCA) and input-oriented Data Envelopment Analysis (DEA). The main objective is to improve the environmental sustainability of this crop by reducing the environmental impacts associated with barley production. Data from various regions of central Italy were used, and a comparative assessment was conducted to identify the most efficient agricultural practices. The study concludes that the adoption of optimized agricultural practices can significantly improve eco-efficiency, reducing greenhouse gas emissions and other environmental impacts.
Direction
GONZALEZ GARCIA, SARA (Tutorships)
Court
OTERO PEREZ, XOSE LOIS (Chairman)
SOUTO GONZALEZ, JOSE ANTONIO (Secretary)
ALDREY VAZQUEZ, JOSE ANTONIO (Member)
Obtention of odd-chain fatty acids through lactate fermentation
Authorship
C.A.M.P.
Master in Chemical and Bioprocess Engineering
Defense date
09.11.2024 12:00
Summary
Currently, a large and growing amount of organic waste is generated, which is why alternatives must be sought both to reduce waste generation and to valorize it. An interesting option is the carboxylate platform, which allows fatty acids to be produced from organic matter by acidogenic fermentation. Within this fermentation there are different intermediates that lead to different fatty acids, among which lactate stands out. Lactate is the intermediate for the production of propionic acid, which is the main precursor of odd-chain fatty acids. Odd-chain fatty acids are relevant because they are intermediates for the production of bioplastics. Given the influence of the operational parameters, studying their effect is decisive when trying to improve the process. The effects of three operational parameters were studied, namely the culture medium, the pH and the substrate-to-inoculum ratio, in order to design an optimal experimental scheme. The results of these experimental assays show that the addition of yeast extract does not significantly improve fatty acid production. The pH influences the proportions in the fatty acid spectrum, the adaptation of the bacteria to the medium and the rate of substrate consumption. At pH 5.5, fermentation is not stable and the lag phase is longer. The higher the inoculum-to-substrate ratio, the higher the concentration of fatty acids and the more fatty acids appear in the spectrum. The production of medium-chain fatty acids requires electron donors and depends on both the type and the concentration of the electron donor. For this reason, the metabolism of chain elongation must be studied in depth and other possible electron donors must be investigated.
Direction
MAURICIO IGLESIAS, MIGUEL (Tutorships)
REGUEIRA LOPEZ, ALBERTE (Co-tutorships)
Court
CASARES LONG, JUAN JOSE (Chairman)
EIBES GONZALEZ, GEMMA MARIA (Secretary)
FRANCO URIA, MARIA AMAYA (Member)
Environmental assessment of the production of phenolic compounds and bioethanol from a waste stream of aronia juice production.
Authorship
I.N.P.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 10:30
Summary
With the aim of promoting the circularity and sustainability of agri-food industry processes, this Master Thesis deals with the recovery of a waste stream from the production of aronia juice, a red fruit rich in antioxidants. To this end, a biorefinery scenario is proposed in which two main products are obtained: phenolic compounds with a high added value in the nutraceutical and pharmaceutical product market, and bioethanol as a biotechnological alternative for the reduction of fossil fuels in the transport sector. For the development of this biotechnological production model, the data available in the bibliographical references were used and the large-scale biorefinery model was developed in the SuperPro Designer software. Subsequently, to assess the degree of sustainability of the process, the Life Cycle Analysis methodology was used, following the guidelines defined in the ISO 14040 and PAS 2050 standards, as well as circularity indicators, thus identifying the environmental and circular profile of the process. Furthermore, once the stages of the process with the greatest environmental contribution were identified, a sensitivity analysis was carried out with the aim of improving the quality and the sustainability and circularity potential of the biorefinery.
Direction
MOREIRA VILAR, MARIA TERESA (Tutorships)
Arias Calvo, Ana (Co-tutorships)
Court
ROCA BORDELLO, ENRIQUE (Chairman)
VAL DEL RIO, MARIA ANGELES (Secretary)
RODRIGUEZ OSORIO, CARLOS (Member)
Assessment of the Continuous Partial Nitritation Process for Efficient Nitrogen Removal in the Sludge Line of a Wastewater Treatment Plant.
Authorship
A.P.R.B.
Master in Chemical and Bioprocess Engineering
Defense date
02.15.2024 11:00
Summary
The removal of nitrogen and phosphorus in wastewater treatment plants is a matter of considerable significance, owing to the consistent efforts of European legislation to reduce discharge limits. The overarching objective is to address concerns such as eutrophication, toxicity to microorganisms, and contamination of receiving bodies of water. For this reason, improving the sludge line is crucial, as it carries the highest nitrogen load in wastewater treatment plants (WWTPs). Specifically, water from sludge dewatering, or the supernatant from anaerobic digestion, has ammonium concentrations exceeding 400 mg N-NH4/L. Several technologies have been proposed for nitrogen removal, with partial nitritation-anammox being of particular interest in this research. In this process, 50% of the ammonium is oxidized to nitrite to meet the requisite levels for the anammox system. Although this system has been used in pilot processes at both laboratory and industrial scales, its implementation in a continuous system remains unexplored, making it the central focus of this study. The primary objective of this research was to find operational conditions that generate the highest possible degree of partial nitritation. Variables such as pH, dissolved oxygen, hydraulic retention time (HRT), and sludge retention time (SRT) were analyzed in order to inhibit the nitrite-oxidizing bacteria. The study commenced with a nitrogen level equivalent to 100 mg N-NH4/L, progressively increased as the available inorganic carbon (IC) approached values between 1 and 2 mg IC/L. The best partial nitritation result was achieved on day 28 of operation, generating a nitrite concentration of 196.32 mg N-NO2/L. Upon completion of the operation of the partial nitritation reactor, the activity of the ammonium-oxidizing bacteria, which play a crucial role in the nitritation process, was evaluated. The registered value was 0.32 mg N/mg VSS·d, half of the value found in the literature. However, this is explained by the significant amount of biomass adhered to the reactor walls and the sludge retained in recirculation. According to the results of this research, the implementation of partial nitritation in a continuous reactor for high nitrogen levels can be considered feasible, as long as the inhibitory factors for nitrite-oxidizing bacteria are kept under control.
Direction
MOSQUERA CORRAL, ANUSKA (Tutorships)
VAL DEL RIO, MARIA ANGELES (Co-tutorships)
Pedrouso Fuentes, Alba (Co-tutorships)
Court
SOTO CAMPOS, ANA MARIA (Chairman)
Balboa Méndez, Sabela (Secretary)
SOUTO GONZALEZ, JOSE ANTONIO (Member)
Machine Learning versus Econometric Models: Which is Better for Predicting Decisions in Tourism?
Authorship
S.I.V.R.
Master in Massive Data Analysis Technologies: Big Data
Defense date
09.13.2024 16:30
Summary
This master's thesis aims to compare the effectiveness of machine learning models and traditional econometric models in predicting decisions in the tourism sector, specifically in the routes of the Camino de Santiago. The work is carried out in the context of the project “Comparing the Performance of Choice-Based Models: A Culture Evaluation” developed by Dr. Yago Atrio Lema under the supervision of Professor Eduardo Sánchez. Through a literature review and the analysis of data collected between 2003 and 2023, different predictive models are evaluated and compared. The study includes data preprocessing, hyperparameter selection, and the use of specific metrics to measure model performance. The results indicate that machine learning models, although complex, offer high predictive capability that can complement and, in some cases, surpass traditional econometric models.
Direction
Sánchez Vila, Eduardo Manuel (Tutorships)
Court
Fernández Pena, Anselmo Tomás (Chairman)
MOSQUERA GONZALEZ, ANTONIO (Secretary)
Triñanes Fernández, Joaquín Ángel (Member)
Evaluation of Cube as a semantic layer for a data-driven organization.
Authorship
A.S.A.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.16.2024 10:30
Summary
Business activities generate large amounts of data, which are used to facilitate and improve decision making. However, the constant growth of data leads to its decentralization, dividing it across different models and databases to facilitate its management. From a business perspective, access to data should be centralized and unified, allowing data from different models and sources to be combined to improve decision making. As a result, data-driven companies are in need of tools that help aggregate all available data and display it in a way that business users can understand. In this context, this work focused on evaluating Cube, a semantic layer tool designed to fulfill the aforementioned purpose. The aim of this evaluation was to understand how it works and the possibilities it offers for use at FDS, A DXC Technology Company, which is looking for a semantic layer tool to use with the company's various customers. The evaluation comprised three phases, starting by defining the company's requirements for accepting the tool. Secondly, a semantic layer was created using Cube on a sample model to determine whether it met the established requirements. Next, usability and performance tests were carried out to assess the final viability of the tool. Once this process was completed, a final evaluation was carried out, covering both the technical experience of using Cube and observations in terms of data-model definition, usability, and performance. After this first evaluation, multiple paths remain open for further investigation should FDS decide to continue evaluating the viability of Cube for its customers.
Direction
RIOS VIQUEIRA, JOSE RAMON (Tutorships)
Graña Omil, Ángel (Co-tutorships)
Court
Losada Carril, David Enrique (Chairman)
Blanco Heras, Dora (Secretary)
AMEIJEIRAS ALONSO, JOSE (Member)
Development of an application for the exploitation of semantic layers by voice
Authorship
M.S.O.A.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 16:30
Summary
A semantic layer is an abstraction used in Business Intelligence systems to interpret and manage data in a way that is understandable and useful for end users, without requiring advanced technical knowledge. It converts raw data from the data source into structured information, aligning terminology and concepts with business needs. We present the development of an application that exploits semantic layers using voice and natural language as an innovative approach to information access. This approach not only improves data accessibility and the efficiency of data access, but also has the potential to leverage BI tools, increasing business productivity by freeing users from having to compose and type queries so they can focus on decision-making tasks based on the retrieved data. The development of this application is divided into multiple stages: first, a connection is established between the source database, dumped into Oracle, and the working environment in which the semantic layer is created (Cube Cloud in this case). Once a functional semantic layer is in place, metadata is extracted from the model's dimension tables and fact table and stored in vector format. On top of the vector store, a component based on an LLM is built through the OpenAI API, adapted to the needs of this work. Finally, the visual tool is created using the Streamlit library in Python to allow user interaction with the application.
Direction
Losada Carril, David Enrique (Tutorships)
Sanz Anchelergues, Adolfo (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
López Martínez, Paula (Secretary)
RIOS VIQUEIRA, JOSE RAMON (Member)
Combinatorial Optimization in Predictive Monitoring
Authorship
A.T.R.
Master in Massive Data Analysis Technologies: Big Data
Defense date
09.13.2024 17:00
Summary
This work offers a study on the optimization of sequence-to-sequence models for task prediction in a predictive monitoring system for business processes. Predictive monitoring is crucial for anticipating the behavior of ongoing processes and improving decision-making. Initially, various optimization techniques were investigated, both based on deep learning and meta-heuristics, to enhance the inference phase of the ASTON system. Each technique was evaluated based on its ability to handle specific constraints, the quality of available training data, and the variability in trace lengths. Ultimately, the genetic algorithm (GA) was chosen for its demonstrated efficiency in solving complex combinatorial optimization problems and its adaptability to the problem at hand. The proposed GA iteratively improves a population of potential solutions through selection, crossover, and mutation, maximizing the conditional probability of predicted suffixes. The effects of hyperparameters on the achieved solutions and the feasibility of the developed system for prediction will be studied.
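As a self-contained illustration of the selected approach, the sketch below runs selection, crossover, and mutation over candidate activity suffixes, maximizing a scoring function. Here that function is a random stand-in; in the system described it would be the conditional probability returned by the ASTON sequence-to-sequence model, and the activity names are hypothetical.

import random

ACTIVITIES = ["register", "review", "approve", "reject", "notify", "close"]
SUFFIX_LEN = 5
_cache = {}

def suffix_log_prob(suffix):
    # Stand-in for the seq2seq model: in the real system this would return the
    # conditional log-probability of the suffix given the running prefix.
    key = tuple(suffix)
    if key not in _cache:
        _cache[key] = sum(random.uniform(-2.0, 0.0) for _ in suffix)
    return _cache[key]

def mutate(suffix, rate=0.2):
    return [random.choice(ACTIVITIES) if random.random() < rate else a for a in suffix]

def crossover(a, b):
    cut = random.randint(1, SUFFIX_LEN - 1)
    return a[:cut] + b[cut:]

def genetic_search(pop_size=50, generations=40):
    pop = [[random.choice(ACTIVITIES) for _ in range(SUFFIX_LEN)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=suffix_log_prob, reverse=True)
        parents = pop[: pop_size // 2]                                    # selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]              # crossover + mutation
        pop = parents + children
    return max(pop, key=suffix_log_prob)

print(genetic_search())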
Direction
LAMA PENIN, MANUEL (Tutorships)
VIDAL AGUIAR, JUAN CARLOS (Co-tutorships)
RAMA MANEIRO, EFREN (Co-tutorships)
Court
Fernández Pena, Anselmo Tomás (Chairman)
MOSQUERA GONZALEZ, ANTONIO (Secretary)
Triñanes Fernández, Joaquín Ángel (Member)
Development of a reservation cancellation prediction model in the tourism sector.
Authorship
N.F.Q.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.16.2024 11:00
Summary
In the hotel industry, the uncertainty caused by booking cancellations makes it difficult to forecast availability in the establishment, leading to significant profit losses. This project conducts an in-depth study of the cancellation patterns for Eurostars Hotel Company, using advanced Big Data methods to clean, transform, and analyze large volumes of historical data. Based on the insights gained, a set of cancellation detection models is developed, and their performance is discussed from both technical and business perspectives. Once the best model is selected, it is compared to a classifier implemented by an external company. Finally, a visual tool is designed to assist industry staff in analyzing the predictions, enabling the development of effective strategies to mitigate the impact of cancellations and estimate the benefit provided by their application.
Direction
Sánchez Vila, Eduardo Manuel (Tutorships)
Comesaña García, Alejandra (Co-tutorships)
Freire Martínez, Ignacio (Co-tutorships)
Court
Losada Carril, David Enrique (Chairman)
Blanco Heras, Dora (Secretary)
AMEIJEIRAS ALONSO, JOSE (Member)
Automation of data access through a data space connector.
Authorship
A.V.L.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 17:30
Summary
This project aims to study how data spaces function and how to automate their use. This Master Thesis uses the connector developed by Eclipse. In order to automate the use of data spaces, a software tool has been developed that facilitates access to a data space for any user willing to join it and upload data to it. This software must be compatible with the main database management systems used by companies, so that data-sharing friction is minimal. Additionally, the automatic generation of definitions for the data that the actor would like to share has been developed, and the definition of the policies that can be applied to this data and of the contracts under which it will be offered within a data space has also been automated. These advances in the use of data spaces are truly beneficial for users who want to be part of them, making them more accessible and easier to use for any company that would like to benefit from them. Simplifying access to a data space becomes even more necessary considering that the EU is promoting these initiatives to achieve a functional European data space.
Direction
VIDAL AGUIAR, JUAN CARLOS (Tutorships)
Reiriz Cores, Diego (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
López Martínez, Paula (Secretary)
RIOS VIQUEIRA, JOSE RAMON (Member)
Development of a recommender system for administrative procedures using machine learning and artificial intelligence techniques on an advanced Cloud infrastructure
Authorship
A.O.Q.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.16.2024 09:30
Summary
The implementation of recommendation systems has become an essential tool to optimize user experience and improve efficiency in various sectors. That is why this work focuses on the development of a recommender for administrative procedures using advanced machine learning and artificial intelligence techniques on a Cloud infrastructure. The aim is to facilitate and personalize the interaction of citizens with a specific regional public body, improving accessibility and efficiency in carrying out administrative procedures. To this end, various recommender systems are proposed, including collaborative filtering, content-based filtering, and hybrid approaches. These models are implemented using Python, taking advantage of the AWS Cloud infrastructure for agile and scalable development. The process is framed within the MLOps (Machine Learning Operations) methodology, guaranteeing continuous integration and constant monitoring of the system from its design to its deployment. The project is structured in several phases that include the understanding and preparation of the data, the construction and evaluation of the predictive models, and the selection and documentation of the final model. The data sources used come from three different platforms that record the interaction of citizens with administrative procedures, thus allowing effective personalization of recommendations. It should be noted that the data used is real and not simulated, which guarantees greater precision and relevance in the recommendations offered. This project not only seeks to improve the user experience by simplifying the search and completion of procedures, but also to promote transparency and trust in public administration.
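To make the recommendation approaches mentioned above concrete, the sketch below shows a minimal item-based collaborative filter in Python. The citizen/procedure interaction matrix and function names are hypothetical; the thesis's production models and AWS/MLOps pipeline are considerably more elaborate.

# Minimal item-based collaborative filtering sketch (illustrative only).
# The interaction matrix below is made up for demonstration purposes.
import numpy as np

# Rows = citizens, columns = administrative procedures (1 = interacted).
interactions = np.array([
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 1, 1, 1],
], dtype=float)


def item_similarity(matrix: np.ndarray) -> np.ndarray:
    """Cosine similarity between procedure columns."""
    norms = np.linalg.norm(matrix, axis=0, keepdims=True)
    norms[norms == 0] = 1.0
    normalized = matrix / norms
    return normalized.T @ normalized


def recommend(user_idx: int, matrix: np.ndarray, top_k: int = 2) -> list[int]:
    """Score unseen procedures by similarity to those the citizen already used."""
    sim = item_similarity(matrix)
    scores = sim @ matrix[user_idx]
    scores[matrix[user_idx] > 0] = -np.inf  # do not re-recommend known procedures
    return list(np.argsort(scores)[::-1][:top_k])


print(recommend(0, interactions))  # indices of the suggested procedures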
Direction
FELIX LAMAS, PAULO (Tutorships)
López Cabaleiros, Iván (Co-tutorships)
Court
Losada Carril, David Enrique (Chairman)
Blanco Heras, Dora (Secretary)
AMEIJEIRAS ALONSO, JOSE (Member)
Dynamic Injection of Galician Legal Knowledge into Large Language Models
Authorship
A.F.S.
Master in Massive Data Analysis Technologies: Big Data
Defense date
09.13.2024 16:00
Summary
Large language models represent a significant advancement in information retrieval and knowledge distillation for end users, owing to the vast amount of data they are trained on and their capabilities for comprehension and reasoning in natural language. However, despite the extensive information they have learned, they may still have difficulties with domain-specific data, with the comprehension of less commonly used languages, and with updated or new data that the model was not trained on. To address these issues, there are various methods, such as inserting new external knowledge through the input or through retraining, allowing the models to process and infer responses based on this knowledge for tasks that require it. This work addresses the problem of how a pre-trained large language model can learn knowledge of the legislative domain in Galician, exploring the capabilities of different models through both a context-based approach and an approach that injects knowledge through parameter retraining, while also considering limitations such as the continuous change in the information contained in the data and the need for dynamic knowledge injection.
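The context-based approach mentioned above amounts to retrieving relevant legal passages and prepending them to the model's prompt. The sketch below illustrates that pattern with a simple TF-IDF retriever; the corpus snippets and the prompt wording are invented placeholders, and the thesis evaluates far richer retrieval and retraining setups.

# Illustrative retrieval-augmented prompting sketch (not the thesis's pipeline).
# A TF-IDF retriever stands in for whatever retriever the real system uses;
# the Galician legal snippets below are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Artigo 1: definicións básicas do procedemento administrativo.",  # placeholder
    "Artigo 2: prazos de presentación de alegacións.",                 # placeholder
    "Artigo 3: réxime sancionador e recursos.",                        # placeholder
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(corpus)


def build_prompt(question: str, top_k: int = 2) -> str:
    """Retrieve the most similar passages and inject them as context."""
    query_vec = vectorizer.transform([question])
    scores = cosine_similarity(query_vec, doc_vectors).ravel()
    best = scores.argsort()[::-1][:top_k]
    context = "\n".join(corpus[i] for i in best)
    return f"Contexto legal:\n{context}\n\nPregunta: {question}\nResposta:"


prompt = build_prompt("Cales son os prazos para presentar alegacións?")
print(prompt)  # this prompt would then be sent to the language model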
Direction
LAMA PENIN, MANUEL (Tutorships)
VIDAL AGUIAR, JUAN CARLOS (Co-tutorships)
Court
Fernández Pena, Anselmo Tomás (Chairman)
MOSQUERA GONZALEZ, ANTONIO (Secretary)
Triñanes Fernández, Joaquín Ángel (Member)
EfficientBaGAN: an efficient way to use a balancing GAN for forest mapping
Authorship
N.V.P.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 17:00
Summary
Achieving high accuracies in the classification of multispectral remote sensing datasets with deep learning techniques is a challenge, as these datasets are often characterized by limited labeled information and class imbalances. Different GAN-based methods used as data augmentation tools have successfully alleviated these issues in recent years. Another problem is the high computational cost associated with deep learning methods. The EfficientNet architecture has emerged as an alternative for obtaining high accuracies with moderate computational cost. To tackle these challenges, this work presents EfficientBaGAN, a method for the classification of multispectral remote sensing images based on the EfficientNet model. EfficientBaGAN aims to reduce the effect of data scarcity and class imbalances by leveraging the so-called BAGAN architecture, but incorporating a residual EfficientNet-based generator and a discriminator that is also EfficientNet-based. Additionally, a superpixel-based sample extraction procedure is employed to further reduce the computational cost. Experiments were conducted on large, very high-resolution multispectral images of vegetation. EfficientBaGAN achieved higher accuracy than other equally sophisticated classification methods while presenting a lower computational cost, being more than twice as fast as those methods.
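The superpixel-based sample extraction mentioned above replaces per-pixel training samples with one averaged sample per superpixel, which is where much of the computational saving comes from. The sketch below illustrates that step with SLIC from scikit-image on a random image; the segmentation parameters and the way labels are aggregated are assumptions, not the exact procedure of EfficientBaGAN.

# Illustrative superpixel-based sample extraction (parameters are assumptions).
# Each superpixel contributes a single averaged spectrum instead of thousands
# of per-pixel samples, which reduces the training cost substantially.
import numpy as np
from skimage.segmentation import slic

# Stand-in for a multispectral image: height x width x bands.
rng = np.random.default_rng(0)
image = rng.random((256, 256, 8))
labels = rng.integers(0, 3, size=(256, 256))  # placeholder class map

segments = slic(image, n_segments=500, compactness=10.0, channel_axis=-1)

samples, sample_labels = [], []
for seg_id in np.unique(segments):
    mask = segments == seg_id
    samples.append(image[mask].mean(axis=0))           # mean spectrum of the superpixel
    values, counts = np.unique(labels[mask], return_counts=True)
    sample_labels.append(values[counts.argmax()])       # majority class inside the superpixel

X = np.stack(samples)   # one training sample per superpixel
y = np.array(sample_labels)
print(X.shape, y.shape)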
Direction
Blanco Heras, Dora (Tutorships)
Argüello Pedreira, Francisco Santiago (Co-tutorships)
Court
MUCIENTES MOLINA, MANUEL FELIPE (Chairman)
López Martínez, Paula (Secretary)
RIOS VIQUEIRA, JOSE RAMON (Member)
Performance analysis of OLAP systems for real-time analytics
Authorship
M.T.H.
Master in Massive Data Analysis Technologies: Big Data
Defense date
07.15.2024 17:00
Summary
This project explores the performance of OLAP (Online Analytical Processing) systems in real-time analytics applications. As companies seek to improve decision making by analyzing large volumes of data in real time, OLAP systems have become essential. ClickHouse, Apache Doris, and StarRocks systems are evaluated, highlighting their architectures, optimization methods, and scalability capabilities. The analysis is based on key metrics such as execution time, resource usage efficiency, and the ability to handle various volumes of data in different scenarios and use cases. The results show that execution times are considerably lower on ClickHouse compared to StarRocks and especially to Apache Doris. To achieve this level of performance, ClickHouse uses a greater amount of CPU processing power than the other systems, as well as more memory on queries involving JOINs.
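A minimal version of the kind of measurement used in such a comparison is sketched below: the same query is executed repeatedly against a system and wall-clock latencies are collected. The harness is generic; the clickhouse-connect client and the query text in the usage section are assumptions for illustration, not the thesis's exact benchmark configuration.

# Illustrative query-latency harness (not the thesis's exact benchmark setup).
# Any OLAP system can be plugged in by passing a function that runs one query.
import statistics
import time
from typing import Callable


def benchmark(run_query: Callable[[], None], repetitions: int = 10) -> dict:
    """Run the query several times and report latency statistics in seconds."""
    latencies = []
    for _ in range(repetitions):
        start = time.perf_counter()
        run_query()
        latencies.append(time.perf_counter() - start)
    return {
        "min": min(latencies),
        "median": statistics.median(latencies),
        "max": max(latencies),
    }


if __name__ == "__main__":
    # Hypothetical usage against a local ClickHouse server via the
    # clickhouse-connect client; Doris and StarRocks could be driven the
    # same way through their MySQL-compatible interfaces.
    import clickhouse_connect

    client = clickhouse_connect.get_client(host="localhost")
    query = "SELECT sum(number) FROM numbers(1000000)"  # placeholder query
    print(benchmark(lambda: client.query(query)))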
Direction
Triñanes Fernández, Joaquín Ángel (Tutorships)
López Chao, Brais (Co-tutorships)
Court
Argüello Pedreira, Francisco Santiago (Chairman)
LAMA PENIN, MANUEL (Secretary)
FELIX LAMAS, PAULO (Member)