Evaluation of the Impact on Costs of Various Items in Cooperative Inventory Models with Multiple Agents
Authorship
E.D.G.
Master in Statistical Techniques
Defense date
02.05.2025 10:30
Summary
This work addresses the analysis of deterministic inventory models in the context of cooperative game theory, exploring their application to cost distribution problems. The fundamental concepts necessary for the study are presented, including an introduction to cooperative game theory, transferable utility (TU) games, and their main solutions, such as the core, the Shapley value, and the Owen value. Within the framework of inventory models, different configurations of the EOQ (Economic Order Quantity) model are analyzed, starting with the basic deterministic model and extending to cases involving multiple items and agents. Two key variants are examined: models with exemptible costs, which include constraints allowing certain coalitions to be excluded from cost contributions, and models without exemptible costs, where all coalitions fully participate. Throughout the work, an illustrative example is studied to highlight the implications of the models and their solutions in practice. Additionally, the impact of different cost distribution rules on the allocation among agents or items is analyzed, considering both fairness and system efficiency.
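For reference, the two textbook objects the summary leans on can be written compactly. These are the standard single-agent EOQ formulas and the general Shapley value, not the thesis's specific multi-item, multi-agent extensions:

```latex
% Basic EOQ: average cost per unit time for order size Q, with fixed ordering
% cost a, demand rate d and unit holding cost h, and its minimizer
\[
C(Q) = \frac{a d}{Q} + \frac{h Q}{2}, \qquad
Q^{*} = \sqrt{\frac{2 a d}{h}}, \qquad
C(Q^{*}) = \sqrt{2 a d h}
\]
% Shapley value of a TU game (N, v): player i's average marginal contribution
\[
\phi_i(v) = \sum_{S \subseteq N \setminus \{i\}}
  \frac{|S|!\,\bigl(|N| - |S| - 1\bigr)!}{|N|!}\,
  \bigl( v(S \cup \{i\}) - v(S) \bigr)
\]
```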
Direction
GARCIA JURADO, IGNACIO (Tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Vidal Puga, Juan José (Chairman)
Oviedo de la Fuente, Manuel (Secretary)
PATEIRO LOPEZ, BEATRIZ (Member)
Global Optimality Certification in Large-Scale Parameter Estimation in Dynamic Systems: Solving the 3SP Problem
Authorship
M.F.F.D.D.
Master in Statistical Techniques
Defense date
06.18.2025 12:15
Summary
In this work, we present a globally optimal solution to a challenging, large-scale parameter estimation problem in dynamic systems. The proposed approach builds upon the methodology developed in our previous work, reported in the article "Parameter estimation in ODEs: assessing the potential of local and global solvers" (De Dios et al., 2025, accepted in Optimization and Engineering), and is applied to the resolution of the 3SP problem formulated by Moles et al. (2003). According to the literature reviewed, this problem has not previously been addressed through a complete formulation combined with a certified globally optimal solution. Our methodology is based on formulating the problem as a nonlinear programming (NLP) model, implemented in AMPL and solved with deterministic global solvers. This approach makes it possible to obtain solutions with a global optimality guarantee in a context where heuristic methods have traditionally been used. The results show that it is feasible to tackle large-scale problems using global optimization techniques based on AMPL and current deterministic solvers. This challenges previously assumed limitations in the literature and opens the door to solving even more complex problems.
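As a rough sketch of the kind of formulation described (the actual 3SP model is far larger, and its exact variables are not reproduced here), the estimation problem can be stated as a least-squares NLP with the ODE system as a constraint:

```latex
% Generic parameter-estimation NLP: fit parameters theta so that the ODE
% trajectory x(t; theta) matches the observations x~_k at times t_k
\[
\min_{\theta}\ \sum_{k=1}^{K} \bigl\| x(t_k; \theta) - \tilde{x}_k \bigr\|_2^{2}
\quad \text{subject to} \quad
\dot{x}(t) = f\bigl( x(t), \theta \bigr), \quad
x(t_0) = x_0, \quad
\theta^{L} \le \theta \le \theta^{U}
\]
```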
Direction
GONZALEZ DIAZ, JULIO (Tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Vidal Puga, Juan José (Chairman)
Oviedo de la Fuente, Manuel (Secretary)
PATEIRO LOPEZ, BEATRIZ (Member)
Pricing with rough volatility models in commodity markets
Authorship
H.F.C.
Master in Industrial Mathematics
Defense date
07.09.2025 12:00
Summary
The objective of this Master's Thesis is to investigate the applicability of the rough Bergomi (rBergomi) model to pricing futures options on commodities. The rBergomi model is a stochastic volatility model driven by a fractional Brownian motion. It has been extensively studied in the context of equity markets; however, the literature on its application in commodity markets is scarce. We first review the mathematical foundations of asset pricing and the rBergomi model, analysing its advantages and disadvantages compared to classical stochastic volatility models. We then propose a novel model for pricing futures options on commodities that implements the rBergomi dynamics. Finally, we present an efficient numerical scheme for simulating the model and calibrate it to real market data on WTI Crude Oil.
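For context, the standard rBergomi specification (Bayer, Friz and Gatheral, 2016) writes the instantaneous variance as a Wick exponential of a Riemann-Liouville fractional Brownian motion; the thesis adapts these dynamics to commodity futures:

```latex
% rBergomi instantaneous variance: xi_0 is the initial forward variance
% curve, eta the volatility of volatility, H in (0, 1/2) the Hurst index
\[
v_u = \xi_0(u) \exp\!\left( \eta \sqrt{2H} \int_0^u (u-s)^{H - 1/2}\,
      \mathrm{d}W_s \;-\; \frac{\eta^2}{2}\, u^{2H} \right)
\]
```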
Direction
Vázquez Cendón, Carlos (Tutorships)
Court
VAZQUEZ CENDON, MARIA ELENA (Coordinator)
Varas Mérida, Fernando (Chairman)
Terragni, Filippo (Secretary)
López Pouso, Óscar (Member)
Simulation of evaporation, transfer and condensation conditions in a reactor
Authorship
J.G.P.
Master in Industrial Mathematics
Defense date
07.09.2025 10:00
Summary
This master's thesis focuses on the simulation of an innovative industrial process promoted by the company Ferroglobe. The process takes place in a vacuum-operated reactor, where an electric field is applied to generate heat. This heating allows the necessary conditions to be reached for the reactants inside to produce a high-value metallurgical product. The product is formed in the gas phase, transported along the reactor, and condensed at the top, where a condenser is located. The main objective of the project is to understand and optimize this process using mathematical modeling and numerical simulation techniques. All simulations were carried out with ANSYS Fluent, a software package taught in the Master's in Industrial Mathematics. With the reactor simulation, the company will gain a better understanding of its operation, facilitating the evaluation of geometric modifications, the identification of zones at risk of condensation, and the scaling of the process, thus reducing the need for experimental testing.
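Although the thesis relies on ANSYS Fluent rather than a hand-written model, the coupling it describes is, in generic form, an electric potential equation whose Joule dissipation feeds the energy balance. This is a sketch of the standard equations, not necessarily the exact model solved:

```latex
% Electric potential phi with conductivity sigma, the resulting Joule
% heating source Q, and the convective-diffusive temperature equation
\[
\nabla \cdot \bigl( \sigma \nabla \varphi \bigr) = 0, \qquad
Q = \sigma \lvert \nabla \varphi \rvert^{2}, \qquad
\rho c_p \left( \frac{\partial T}{\partial t}
  + \mathbf{u} \cdot \nabla T \right)
  = \nabla \cdot \bigl( k \nabla T \bigr) + Q
\]
```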
Direction
GOMEZ PEDREIRA, MARIA DOLORES (Tutorships)
Court
VAZQUEZ CENDON, MARIA ELENA (Coordinator)
Varas Mérida, Fernando (Chairman)
Terragni, Filippo (Secretary)
López Pouso, Óscar (Member)
Filament estimation
Authorship
H.G.V.
Master in Statistical Techniques
Defense date
02.05.2025 09:30
Summary
Manifold estimation allows for non-linear, non-parametric dimension reduction when working with data in a Euclidean space that are actually supported on (or close to) a lower-dimensional manifold, providing a better understanding of their underlying structure. In the particular case where the manifold is a curve, the problem is known as filament estimation. The aim of this work is to propose a new filament estimator that achieves the minimax-optimal rate of convergence in Hausdorff distance, up to a logarithmic factor, when the ambient space is the plane. First, an introduction to the concepts, shape conditions and estimators used in set estimation is presented. Next, the so-called EDT (Euclidean Distance Transform) estimator, in a filament estimation model with additive noise, is reviewed. A perpendicular noise model, in a more general manifold estimation context, in which the minimax rate is known, is also presented. Lastly, the new estimator, called the EDT estimator with r-convex hull, is proposed, and its convergence rate is obtained. We also study a possible data-driven choice of the shape parameter r that does not affect the convergence rate. The proposed estimator is applied to a tree stem cross-section estimation problem in forest inventory.
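Two definitions from set estimation anchor the summary: the Hausdorff distance, in which the convergence rate is stated, and the r-convex hull used by the proposed estimator (standard definitions, recalled here for the reader):

```latex
% Hausdorff distance between nonempty compact sets A, B in R^d
\[
d_H(A, B) = \max\Bigl\{ \sup_{a \in A} \inf_{b \in B} \lVert a - b \rVert,\;
                        \sup_{b \in B} \inf_{a \in A} \lVert a - b \rVert \Bigr\}
\]
% r-convex hull of A: complement of the union of all open balls of radius r
% that miss A; the set A is said to be r-convex when C_r(A) = A
\[
C_r(A) = \Bigl( \bigcup_{\{x \,:\, B_r(x) \cap A = \emptyset\}} B_r(x) \Bigr)^{c}
\]
```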
Direction
PATEIRO LOPEZ, BEATRIZ (Tutorships)
RODRIGUEZ CASAL, ALBERTO (Co-tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Bergantiños Cid, Gustavo (Chairman)
GINZO VILLAMAYOR, MARIA JOSE (Secretary)
Darriba López, Diego (Member)
Development of Automatic Classification Models for Digital Documents using Transformers
Authorship
M.G.H.S.
Master in Statistical Techniques
Defense date
02.05.2025 09:45
Summary
Managing large volumes of documents presents a significant challenge for organizations, as manually classifying and processing them is inefficient and results in a waste of resources. While traditional approaches may be necessary in certain contexts, they limit the ability to quickly access and effectively use information. In response to this situation, various technological solutions are evolving to facilitate efficient organization of and access to documents.

The first part of this work presents the development of a text classification model based on Transformers, an advanced natural language processing (NLP) architecture. The model automates the document classification process, which not only improves organizational efficiency but also enables faster and more effective use of the documents afterward. This approach, leveraging pre-trained models like BERT, takes advantage of their adaptability to specific tasks, making it possible to efficiently classify large volumes of data and enhance quick, accurate access to relevant information. In doing so, it contributes to resource optimization and better information management within the organization. To validate the model's effectiveness, it was compared to traditional classifiers such as kNN, Naïve Bayes, and Random Forest, using the same training data. In all cases, the BERT-based model demonstrated superior generalization capabilities, showing remarkable performance when classifying documents on topics none of the classifiers had encountered during training, and outperforming traditional techniques on the analyzed datasets. This highlights its advantage in adapting to new contexts and document types without requiring significant reconfiguration or adjustments. BERT's architecture allows it to understand the context and deep meaning of the text, providing flexibility in handling a wide variety of tasks, even when faced with data that does not perfectly align with the training examples. This adaptive capability makes BERT an ideal solution for environments where data and needs are constantly evolving, allowing for greater efficiency and accuracy in both document classification and information retrieval.

The second part of this work focuses on developing a model for information retrieval and generation. This model serves as an initial proposal aimed at facilitating access to information from various sources, adding significant value to the organization's operational processes and optimizing the use of available data for decision-making. The model has been evaluated using a dataset extracted from the Hugging Face platform. The results show that the generated responses achieved a cosine similarity of over 60% with the expected responses when the provided context was relevant to the question, suggesting a high degree of content alignment. This validates the model's ability to generate coherent and relevant responses in scenarios where context is key.
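A minimal sketch of the BERT-based classification step, using the Hugging Face transformers library; the checkpoint name, label count and example documents are illustrative assumptions, not the configuration actually used in this work, and the classification head would still need fine-tuning on labeled documents:

```python
# Sketch: encode a batch of documents with a pre-trained BERT checkpoint
# and take the argmax over class logits as the predicted document type.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

CHECKPOINT = "bert-base-multilingual-cased"   # hypothetical model choice
tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(
    CHECKPOINT, num_labels=4                  # one logit per document class
)

docs = ["Invoice for office supplies ...", "Draft of an employment contract ..."]
batch = tokenizer(docs, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits            # shape: (num_docs, num_labels)
predictions = logits.argmax(dim=-1)           # predicted class per document
print(predictions.tolist())
```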
Direction
LÓPEZ TABOADA, GUILLERMO (Tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Vidal Puga, Juan José (Chairman)
Oviedo de la Fuente, Manuel (Secretary)
PATEIRO LOPEZ, BEATRIZ (Member)
Construction of a Leading Indicator of Unemployment
Authorship
L.M.G.
Master in Statistical Techniques
Defense date
06.18.2025 10:45
Summary
The unemployment rate is a key variable for analyzing the evolution of the labor market in Spain, but it is published quarterly, with a lag of one and a half months after the end of the quarter in question. The Strategic Planning and PMO Department of ABANCA therefore proposes the construction of a leading indicator of the unemployment rate, estimated monthly from data on Social Security affiliation and registered unemployment, which are available at the beginning of each month. To this end, temporal disaggregation models are used to estimate monthly series of employed and unemployed people from the monthly series of affiliations and registered unemployment. From these estimates, a monthly unemployment rate is calculated, which is then converted to quarterly frequency, allowing comparison with the official data from the Labor Force Survey (LFS). In addition, a methodology for correcting seasonality and calendar effects is incorporated, which is essential for time series analysis, since the official series are not corrected for the impact of these factors. Once these effects have been removed, more detailed analyses and comparisons become feasible that would otherwise be misleading.
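In compact form, and under the simplifying assumption that monthly rates are averaged within the quarter (the thesis may instead aggregate the estimated levels before taking the ratio), the construction reads:

```latex
% Monthly rate from the disaggregated series of unemployed U_t and
% employed E_t, and one simple quarterly aggregation for comparison
% with the LFS figure for quarter q
\[
u_t = \frac{\widehat{U}_t}{\widehat{U}_t + \widehat{E}_t}, \qquad
u_q = \frac{1}{3} \sum_{t \in q} u_t
\]
```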
Direction
Vilar Fernández, José Antonio (Tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Vidal Puga, Juan José (Chairman)
Oviedo de la Fuente, Manuel (Secretary)
PATEIRO LOPEZ, BEATRIZ (Member)
Exploration of methods to estimate the uncertainty of fishing discards using RDBES format
Authorship
J.M.M.C.
Master in Statistical Techniques
Defense date
06.18.2025 11:45
Summary
Accurate estimation of discards is essential for effective fish stock management, since discards form part of fishing mortality; failing to account for them could seriously compromise the long-term sustainability of fish stocks. However, the accuracy of discard estimates is affected by various factors, from the design of the sampling programs to the estimation process itself. Sampling experts have expressed concern about the paucity of data quality indicators in discard estimates, particularly when data from non-probability survey methods need to be used as proxies for probability data. As a case study, we use data from observers on board the Spanish trawl fleet targeting demersal fish in ICES subarea 7. Sampling follows a multistage design, from vessel selection to the collection of the sample of discarded catch during each haul. To obtain the most accurate discard estimates, it is necessary to properly specify the successive methods for selecting sampling units, as well as to correctly parameterize the variance calculation at each sampling step. For this purpose, a two-stage sampling design was employed: in the first stage, fishing trips were selected using simple random sampling without replacement (SRSWOR), while in the second stage, fishing operations were selected using systematic sampling (SYSS). Two complementary approaches were adopted to estimate the variance. First, the classical SRSWOR estimator was used, and, for SYSS, the approximation based on the method of successive differences was compared with the estimate obtained under the SRSWOR assumption. Second, resampling techniques, particularly the bootstrap method, were implemented to assess uncertainty in a more flexible and robust manner.
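For a sample of n units drawn from N, the two variance approximations compared in the text take the standard forms below (written here for a sample mean; the thesis applies them within its multistage estimator):

```latex
% Classical SRSWOR variance estimator with finite-population correction
\[
\widehat{V}_{\mathrm{SRS}}(\bar{y})
  = \Bigl(1 - \frac{n}{N}\Bigr) \frac{s^2}{n}, \qquad
s^2 = \frac{1}{n-1} \sum_{i=1}^{n} (y_i - \bar{y})^2
\]
% Successive-difference approximation, commonly used for systematic samples
\[
\widehat{V}_{\mathrm{SD}}(\bar{y})
  = \Bigl(1 - \frac{n}{N}\Bigr) \frac{1}{n}\,
    \frac{\sum_{i=2}^{n} (y_i - y_{i-1})^2}{2\,(n-1)}
\]
```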
Direction
Mosquera Rodríguez, Manuel Alfredo (Tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Bergantiños Cid, Gustavo (Chairman)
GINZO VILLAMAYOR, MARIA JOSE (Secretary)
Darriba López, Diego (Member)
Sorter Assignment Optimization
Authorship
A.M.A.
Master in Statistical Techniques
Defense date
06.18.2025 09:45
Summary
This work aims to improve the efficiency of a warehouse sorter in the textile industry through the optimal assignment of tasks to its workers. Garments are fed into the sorter from the warehouse, a process in which two work roles are distinguished. Upon reaching its specific destination, each product falls into the box below it. Once full, the boxes are removed by workers, a task involving two further work roles. The decision-making problem consists of determining how many people should occupy each of the four work roles so that the demand, generated by the needs of a set of stores, is satisfied in the shortest possible time. This text describes the problem in depth and formulates an integer linear programming model to solve it. After verifying that this model cannot solve realistic-sized instances, and observing that the total number of worker-role assignments is much lower than expected, a simulator is built that more faithfully recreates the sorter and evaluates the possible assignments, returning the best one as the optimal solution.
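A toy version of the final enumerate-and-simulate approach might look as follows; the workforce size, role names and linear throughput stub are illustrative assumptions, and the real simulator recreates the sorter in far more detail:

```python
# Sketch: enumerate every split of the workforce among the four roles and
# keep the assignment with the smallest simulated completion time.
from itertools import product

TOTAL_WORKERS = 10  # assumed workforce size, for illustration only

def simulate(assignment):
    """Stub simulator: completion time for a (feed-in, feed-aux,
    evac-main, evac-aux) head count, using a toy linear throughput."""
    feed_in, feed_aux, evac_main, evac_aux = assignment
    demand = 1000.0                                   # abstract work units
    rate = 3 * feed_in + 2 * feed_aux + 2 * evac_main + evac_aux
    return demand / rate if rate else float("inf")

best = min(
    (a for a in product(range(TOTAL_WORKERS + 1), repeat=4)
     if sum(a) == TOTAL_WORKERS),                     # feasible assignments
    key=simulate,
)
print("best head count per role:", best)
```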
Direction
GONZALEZ DIAZ, JULIO (Tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Bergantiños Cid, Gustavo (Chairman)
GINZO VILLAMAYOR, MARIA JOSE (Secretary)
Darriba López, Diego (Member)
Automation of the Fault Detection Process in Buoy Oven Tests
Authorship
A.G.P.M.
Master in Industrial Mathematics
Defense date
07.09.2025 12:30
Summary
Marine Instruments is a leading company in the design and manufacture of electronic equipment for marine environments. Its reputation rests on the high reliability of its products, ensured by a rigorous production process that concludes with exhaustive quality control. For buoys, this control includes an oven test: printed circuit boards (PCBs) are held at 60 °C while running a standardized test program, and their power consumption is continuously recorded. Currently, validation is performed manually through visual inspection of the consumption curves to identify possible anomalies. This work proposes automating that classification phase by extracting and modeling characteristic fault features and designing a detection algorithm that combines data analytics and artificial intelligence techniques. The aim is to reduce inspection times, improve process traceability, and further increase the reliability of the final product.
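As an illustrative sketch of one simple detector for such consumption curves, the snippet below flags samples whose deviation from a rolling baseline exceeds k standard deviations; the window, threshold and synthetic data are assumptions, not Marine Instruments' actual method:

```python
# Rolling z-score anomaly flagging on a 1-D power-consumption series.
import numpy as np

def flag_anomalies(power: np.ndarray, window: int = 50, k: float = 4.0):
    flags = np.zeros(len(power), dtype=bool)
    for i in range(window, len(power)):
        ref = power[i - window:i]          # rolling reference window
        mu, sigma = ref.mean(), ref.std()
        if sigma > 0 and abs(power[i] - mu) > k * sigma:
            flags[i] = True                # consumption jump: candidate fault
    return flags

rng = np.random.default_rng(0)
curve = rng.normal(100.0, 1.0, 600)       # synthetic consumption trace
curve[400] += 15.0                        # injected fault-like spike
print(np.flatnonzero(flag_anomalies(curve)))   # typically flags index 400
```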
Direction
VAZQUEZ CENDON, MARIA ELENA (Tutorships)
Court
VAZQUEZ CENDON, MARIA ELENA (Coordinator)
Varas Mérida, Fernando (Chairman)
Terragni, Filippo (Secretary)
López Pouso, Óscar (Member)
Comparison of global optimization methods for parameter estimation in biochemical networks
Authorship
A.P.R.
Master in Industrial Mathematics
Defense date
01.24.2025 10:00
Summary
This study evaluates the performance of various global optimization methods for parameter estimation in biochemical networks, a critical task in Computational Systems Biology. Deterministic and stochastic algorithms are compared using a set of standard optimization problems and four benchmark challenges specific to Systems Biology (the BioPreDyn benchmark set). The goal is to identify the most effective and reliable methods for addressing the non-convex optimization problems that frequently arise in this field. The study’s most significant finding is that the “enhanced Scatter Search” (eSS) method demonstrated the highest reliability in solving Systems Biology optimization problems among the tested metaheuristics. While no single algorithm excelled in all cases, eSS consistently achieved the greatest reduction in the objective function value and demonstrated superior robustness overall. Deterministic methods proved unsuitable for large-scale problems, highlighting their limitations in such contexts. This study highlights the importance of selecting appropriate optimization algorithms for parameter estimation in modeling biochemical networks. It further emphasizes the efficacy of certain metaheuristics in addressing the complex optimization problems that arise in Systems Biology.
Direction
López Pouso, Óscar (Tutorships)
Court
VAZQUEZ CENDON, MARIA ELENA (Coordinator)
VAZQUEZ CENDON, MARIA ELENA (Chairman)
Carretero Cerrajero, Manuel (Secretary)
ARREGUI ALVAREZ, IÑIGO (Member)
Impact of the sales period on sales forecasting in the textile sector
Authorship
M.R.B.
Master in Statistical Techniques
Defense date
06.18.2025 10:00
Summary
Efficient management of sales periods has become increasingly important in the commercial strategy of the textile sector, where balancing stock liquidity and profitability is essential. In this context, INDITEX has reinforced its leadership not only through its agile market response, but also through its control of inventory management and pricing during sales campaigns. Traditionally scheduled after periods of high demand, sales help free up space for new collections, but their optimal planning poses significant challenges. This creates the need to anticipate the impact that different discount levels may have on sales, especially in an environment where margins and customer perception are critical variables. This work addresses this challenge through an in-depth analysis of historical data and the development of predictive models capable of estimating sales performance under various pricing scenarios. The main goal is to provide INDITEX with a tool to support informed decision-making regarding discounts, thereby optimizing commercial performance during the sales period and strengthening its strategic position in the global fashion market.
Direction
AMEIJEIRAS ALONSO, JOSE (Tutorships)
GINZO VILLAMAYOR, MARIA JOSE (Co-tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Vidal Puga, Juan José (Chairman)
Oviedo de la Fuente, Manuel (Secretary)
PATEIRO LOPEZ, BEATRIZ (Member)
Strategic Games: Matrix games and their relation to linear programming
Authorship
A.R.O.
Master in Statistical Techniques
Defense date
02.05.2025 11:30
Summary
This Master's Thesis (TFM) explores game theory, with a focus on strategic games and matrix games, and their relationship with linear programming. Game theory, a mathematical discipline that models strategic decision-making among agents, is analyzed here from both theoretical and practical perspectives, emphasizing competitive and cooperative interactions. The study focuses on matrix games, a fundamental representation of strategic games where strategies and payoffs are organized in a matrix format. Through linear programming, optimization problems associated with these games are addressed, such as finding optimal mixed strategies that maximize players' gains or minimize their losses. The thesis includes practical examples, highlighting the resolution of a matrix-form game using the simplex method, a powerful tool in linear programming. Among the cases analyzed, a real-world problem related to decision-making under uncertainty is examined, demonstrating how game theory and linear programming provide efficient solutions.
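The linear program in question is the classical one for the row player of an m x n matrix game A = (a_ij): maximize the guaranteed payoff v over mixed strategies x. Its dual is the column player's problem, and LP duality yields the minimax theorem:

```latex
\[
\max_{x,\, v}\ v \quad \text{subject to} \quad
\sum_{i=1}^{m} a_{ij}\, x_i \ge v \;\; (j = 1, \dots, n), \qquad
\sum_{i=1}^{m} x_i = 1, \qquad x_i \ge 0
\]
```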
Direction
GARCIA JURADO, IGNACIO (Tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Bergantiños Cid, Gustavo (Chairman)
GINZO VILLAMAYOR, MARIA JOSE (Secretary)
Darriba López, Diego (Member)
Econometric study, from a social point of view, of the efficiency of water management in the public and private sectors
Authorship
P.S.G.
Master in Statistical Techniques
Defense date
02.05.2025 09:00
Summary
Over the years, the choice between public and private management of water resources has been, and continues to be, a subject of debate in many countries, regardless of the particular legislation of each one. There are arguments on both sides of the discussion, and we can divide them into three fundamental categories for judging which administration performs better: tariff price, water quality and management efficiency. In this dissertation, we will summarise some of the contributions of the existing literature on this subject, focusing on those whose empirical analysis is based on Spain, since ours will be as well. Moreover, we will focus on evaluating the efficiency of the two types of management using Data Envelopment Analysis (DEA). Thus, after a brief description of this technique and its main models, we will apply it to our particular case: a comparison of the efficiency of wastewater treatment plants (WWTPs) according to whether they are managed by public or private companies, with Viaqua Gestión Integral de Aguas de Galicia, S.A. as our private operator. Once we have these results, we will be able to compare the performance of WWTPs on the basis of an efficiency measure and to see which ones perform worse. Finally, we will perform a benchmarking analysis and apply a clustering technique, which will allow us to classify WWTPs into groups with similar characteristics and to study which ones are benchmarks for the others.
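For reference, the input-oriented CCR envelopment model, the basic DEA program of the kind applied here (the thesis's exact model and variable selection may differ), scores plant 0 among n plants with inputs x and outputs y:

```latex
% theta is the efficiency score of plant 0: theta = 1 means efficient,
% theta < 1 means its inputs could be contracted while keeping its outputs
\[
\min_{\theta,\, \lambda}\ \theta \quad \text{subject to} \quad
\sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{i0} \;\; \forall i, \qquad
\sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{r0} \;\; \forall r, \qquad
\lambda_j \ge 0
\]
```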
Direction
GINZO VILLAMAYOR, MARIA JOSE (Tutorships)
SAAVEDRA NIEVES, ALEJANDRO (Co-tutorships)
Court
AMEIJEIRAS ALONSO, JOSE (Coordinator)
Vidal Puga, Juan José (Chairman)
Oviedo de la Fuente, Manuel (Secretary)
PATEIRO LOPEZ, BEATRIZ (Member)