Browsing by Subject "Transformer"
Now showing 1 - 2 of 2
Item: Impact in Substation Power Transformer Due to Injection of Non-linear Load (I.O.E. Pulchowk Campus, 2022-09) — Tiwari, Sujit

In an electrical power system, the transformer is considered one of the most important and widely used components. The main focus of this study is the thermal condition of the transformer, which becomes critical when it operates under abnormal conditions such as harmonic loading. In this thesis, the thermal behavior of a 22.5 MVA, 66/11 kV ONAF power transformer is analyzed using the IEEE Std C57.91 model to study the influence of non-linear loads and their impact on transformer life; the model is simulated and analyzed in MATLAB. The transformer considered in the study shows significant life reduction when injected non-linear loads exceed 15% of its rated capacity, in both the winter and summer seasons. The study concludes that higher harmonic load currents produce additional losses in the transformer windings and core, raising the transformer temperature above its standard operating temperature. Specifically, in winter the hot-spot temperature increases by 2 °C for every 1% increase in non-linear loading (relative to rated capacity), along with a 2% increase in required transformer size; in summer, the corresponding increases are 2.7 °C and 3%.

Item: KEYPHRASE DETECTION AND QUESTION GENERATION FROM TEXT USING MACHINE LEARNING (I.O.E. Pulchowk Campus, 2023-04-30) — LAMICHHANE, AAYUSH

Question Generation may not be as prominent as Question Answering, but it remains a relevant task in NLP. The ability to ask meaningful questions provides evidence of comprehension within an Artificial Intelligence (AI) model, which makes the task of question generation important in the bigger picture of AI.
While existing question generation techniques rely on complex model architectures and additional mechanisms to boost performance, we show that transformer-based fine-tuning techniques can create robust question-generating systems using only a single language model, without additional mechanisms, answer metadata, or extensive features. The main training parameters of our project are: epochs: 10, batch size: 4, learning rate: 10e-3. Lastly, we also examine the model's failure modes and identify possible reasons why the model fails.
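As a rough illustration of the single-model, answer-agnostic setup this abstract describes, the training-example preparation might look like the sketch below. The `generate question:` task prefix and the helper name are assumptions in the style of T5-like seq2seq fine-tuning; the thesis does not specify its exact input format. The hyperparameter values are the ones reported above.

```python
# Sketch of preparing answer-agnostic (passage, question) pairs for
# seq2seq fine-tuning. The task prefix is an assumption (T5-style);
# the thesis does not state its exact input format.

def make_qg_example(passage: str, question: str,
                    prefix: str = "generate question: ") -> dict:
    """Format one training example: the passage alone is the input
    (no answer metadata), and the question is the target."""
    return {"input": prefix + passage.strip(), "target": question.strip()}

# Training parameters as reported in the abstract.
HPARAMS = {"epochs": 10, "batch_size": 4, "learning_rate": 10e-3}

example = make_qg_example(
    "The transformer architecture relies on self-attention.",
    "What does the transformer architecture rely on?",
)
print(example["input"])
```

A single language model fine-tuned on pairs formatted this way needs no separate answer-extraction stage, which is the simplification the abstract argues for.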
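For context on the first item above: IEEE Std C57.91 relates insulation aging to the winding hot-spot temperature through an aging acceleration factor, which equals 1.0 at the 110 °C reference hot spot for thermally upgraded paper. A minimal sketch is given below; the 110 °C base hot spot used for illustration is an assumption, and the 2 °C-per-1% sensitivity comes from the abstract's reported winter figure, not from the thesis data itself.

```python
import math

def aging_acceleration_factor(theta_h_c: float) -> float:
    """IEEE Std C57.91 aging acceleration factor for thermally
    upgraded paper; equals 1.0 at the 110 degC reference hot spot."""
    return math.exp(15000.0 / 383.0 - 15000.0 / (theta_h_c + 273.0))

# Illustration using the abstract's reported winter sensitivity of
# ~2 degC hot-spot rise per 1% of non-linear loading. The 110 degC
# base hot spot is an assumption for the sake of the example.
for nonlinear_pct in (0, 5, 10):
    theta = 110.0 + 2.0 * nonlinear_pct
    print(f"{nonlinear_pct}% non-linear load -> "
          f"F_AA = {aging_acceleration_factor(theta):.2f}")
```

Because the factor grows exponentially with hot-spot temperature, even the few degrees of extra rise per percent of non-linear loading translate into the significant loss-of-life the thesis reports.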