
Design of a pilot-scale helium heating system to support the SI cycle (파이롯 규모 SI 공정 시험 설비에서의 헬륨 가열 장치 설계)

  • Jang, Se-Hyun;Choi, Yong-Suk;Lee, Ki-Young;Shin, Young-Joon;Lee, Tae-Hoon;Kim, Jong-Ho;Yoon, Seok-Hun;Choi, Jae-Hyuk
    • Journal of Advanced Marine Engineering and Technology / v.40 no.3 / pp.157-164 / 2016
  • In this study, researchers performed preliminary design and numerical analysis for a pilot-scale helium heating system intended to support full-scale construction of a sulfur-iodine (SI) cycle. The helium heat exchanger used a liquefied petroleum gas (LPG) combustor. Computational thermal and flow analysis showed an exhaust gas velocity of approximately 40 m/s at the heat exchanger outlet. The maximum gas temperature was reached with six baffles in the design; lower gas temperatures were observed with four baffles. The amount of heat transfer was also higher with six baffles. Installing additional baffles may reduce fuel costs because less LPG exhaust gas needs to be supplied to the heat exchanger. However, additional baffles may also increase the pressure difference between the exchanger's inlet and outlet; therefore, it is important to find the optimum number of baffles. Structural analysis, following the thermal and flow analysis, indicated a 3.86 mm thermal expansion at the middle of the shell-and-tube heat exchanger when both ends were supported. Structural analysis conditions included a helium flow rate of 3.729 mol/s and a helium outlet temperature of 910 °C. An exhaust gas temperature of 1300 °C and an exhaust gas rate of 52 g/s were confirmed to achieve the helium outlet temperature of 910 °C, given an exchanger inlet temperature of 135 °C, in the LPG-fueled helium heating system.
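The reported operating figures can be sanity-checked with a quick heat-duty estimate. This sketch assumes helium behaves as an ideal monatomic gas with a constant molar heat capacity of about 20.79 J/(mol·K); that value, and treating the inlet-to-outlet temperature rise as a simple sensible-heat calculation, are assumptions, not figures from the paper.

```python
# Back-of-envelope heat-duty check for the helium heating system.
# Assumption: helium as an ideal monatomic gas, Cp ~= 20.79 J/(mol K).
# Flow rate and temperatures are taken from the abstract.
CP_HE = 20.79                 # J/(mol K), molar heat capacity (assumed)
n_dot = 3.729                 # mol/s, helium flow rate
t_in, t_out = 135.0, 910.0    # degC, exchanger inlet / outlet temperatures

q_watts = n_dot * CP_HE * (t_out - t_in)   # required heat transfer rate
print(f"Required heat duty: {q_watts / 1000:.1f} kW")
```

The result, on the order of 60 kW, gives a feel for the scale of the pilot heating system implied by the reported flow rate and temperatures.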

An Analysis of Big Video Data with Cloud Computing in Ubiquitous City (클라우드 컴퓨팅을 이용한 유시티 비디오 빅데이터 분석)

  • Lee, Hak Geon;Yun, Chang Ho;Park, Jong Won;Lee, Yong Woo
    • Journal of Internet Computing and Services / v.15 no.3 / pp.45-52 / 2014
  • The Ubiquitous City (U-City) is a smart, intelligent city that satisfies people's desire to enjoy IT services with any device, anytime, anywhere. It is a future city model based on the Internet of Everything or Things (IoE or IoT). It includes many networked video cameras, which, together with sensors, support many U-City services as one of the main sources of input data. They constantly generate a huge amount of video information: genuinely big data for the U-City. The U-City is usually required to manipulate this big data in real time, which is not easy at all. Often, the accumulated video data must also be analyzed to detect an event or find a person, which requires a lot of computational power and usually takes a lot of time. Current research tries to reduce the processing time of this big video data, and cloud computing can be a good solution. Among the many cloud computing methodologies that can address the problem, MapReduce is an attractive one: it has many advantages and is gaining popularity in many areas. Video cameras evolve day by day, and their resolution improves sharply, leading to exponential growth of the data produced by the networked cameras. We are coping with real big data when we deal with video image data produced by high-quality cameras. Video surveillance systems were of limited use until cloud computing provided useful methodologies, and they are now widely deployed in U-Cities. Video data are unstructured, so good research results on analyzing them with MapReduce are hard to find. This paper presents an analysis system for video surveillance: a cloud-computing-based video data management system that is easy to deploy, flexible, and reliable.
It consists of the video manager, the video monitors, the storage for the video images, the storage client, and the streaming-in component. The "video monitor" for the video images consists of a "video translator" and a "protocol manager". The "storage" contains the MapReduce analyzer. All components were designed according to the functional requirements of a video surveillance system. The "streaming-in" component receives video data from the networked video cameras and delivers them to the "storage client"; it also manages network bottlenecks to smooth the data stream. The "storage client" receives the video data from the "streaming-in" component and stores them in the storage; it also helps other components access the storage. The "video monitor" component transfers the video data by smooth streaming and manages the protocol. The "video translator" sub-component enables users to manage the resolution, codec, and frame rate of the video image. The "protocol" sub-component manages the Real Time Streaming Protocol (RTSP) and the Real Time Messaging Protocol (RTMP). We use the Hadoop Distributed File System (HDFS) as the cloud-computing storage; Hadoop stores the data in HDFS and provides a platform that can process them with the simple MapReduce programming model. We propose our own methodology for analyzing the video images using MapReduce: the workflow of video analysis is presented and explained in detail in this paper. A performance evaluation was conducted experimentally, and the proposed system worked well; the results are presented and analyzed in this paper. On our cluster system, we used compressed 1920×1080 (FHD) resolution video data, the H.264 codec, and HDFS as the video storage, and measured the processing time according to the number of frames per mapper.
Tracing the optimal splitting size of the input data and the processing time according to the number of nodes, we found that the system performance scales linearly.
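The mapper/reducer split described in the abstract can be sketched compactly. The paper's pipeline runs on Hadoop with HDFS and H.264 video; the in-memory Python below only illustrates the pattern of a mapper emitting key-value pairs and a reducer aggregating them, and the frame records and detection labels are hypothetical stand-ins for decoded video frames.

```python
from collections import defaultdict

# Minimal illustration of the MapReduce split used for video analysis:
# the mapper emits (label, 1) per detection in a frame, the reducer sums
# counts per label. Frame records are hypothetical stand-ins for the
# decoded frames the paper stores in HDFS.
frames = [
    {"frame_id": 0, "detections": ["person", "car"]},
    {"frame_id": 1, "detections": ["person"]},
    {"frame_id": 2, "detections": ["car", "car"]},
]

def mapper(frame):
    for label in frame["detections"]:
        yield label, 1

def reducer(pairs):
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

counts = reducer(kv for frame in frames for kv in mapper(frame))
print(counts)   # → {'person': 2, 'car': 3}
```

In Hadoop the same mapper and reducer contracts apply, but the framework handles splitting the input (e.g., frames per mapper, as measured in the paper), shuffling keys, and running reducers in parallel.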

Wintertime Extreme Storm Waves in the East Sea: Estimation of Extreme Storm Waves and Wave-Structure Interaction Study in the Fushiki Port, Toyama Bay (동해의 동계 극한 폭풍파랑: 토야마만 후시키항의 극한 폭풍파랑 추산 및 파랑 · 구조물 상호작용 연구)

  • Lee, Han Soo;Komaguchi, Tomoaki;Yamamoto, Atsushi;Hara, Masanori
    • Journal of Korean Society of Coastal and Ocean Engineers / v.25 no.5 / pp.335-347 / 2013
  • In February 2008, high storm waves caused by a developed atmospheric low-pressure system, propagating from the west off Hokkaido, Japan, to the south and southwest throughout the East Sea (ES), caused extensive damage along the central coast of Japan and the east coast of Korea. This study consists of two parts. In the first part, we estimate extreme storm wave characteristics in Toyama Bay, where heavy coastal damage occurred, using a non-hydrostatic meteorological model and a spectral wave model, considering extreme conditions for two factors of wind-wave growth: wind intensity and duration. The estimated extreme significant wave height and corresponding wave period at Fushiki, Toyama, were 6.78 m and 18.28 s, respectively. In the second part, we perform numerical experiments on wave-structure interaction in the Fushiki Port, Toyama Bay, where the long North Breakwater was heavily damaged by the storm waves of February 2008. The experiments use a non-linear shallow-water equation model with adaptive mesh refinement (AMR) and a wet-dry scheme. The estimated extreme storm waves of 6.78 m and 18.28 s are used as the incident wave profile. The results show that the Fushiki Port would be overtopped and flooded by extreme storm waves if the North Breakwater did not function properly after being damaged, and that the storm waves would overtop the seawalls and sidewalls of the Manyou Pier behind the North Breakwater. The results also show that the refined meshes produced by the AMR method with the wet-dry scheme capture the coastline and coastal structures well while keeping the computational load efficient.

Study of Coherent High-Power Electromagnetic Wave Generation Based on Cherenkov Radiation Using Plasma Wakefield Accelerator with Relativistic Electron Beam in Vacuum (진공 내 상대론적인 영역의 전자빔을 이용한 플라즈마 항적장 가속기 기반 체렌코프 방사를 통한 결맞는 고출력 전자파 발생 기술 연구)

  • Min, Sun-Hong;Kwon, Ohjoon;Sattorov, Matlabjon;Baek, In-Keun;Kim, Seontae;Hong, Dongpyo;Jang, Jungmin;Bhattacharya, Ranajoy;Cho, Ilsung;Kim, Byungsu;Park, Chawon;Jung, Wongyun;Park, Seunghyuk;Park, Gun-Sik
    • The Journal of Korean Institute of Electromagnetic Engineering and Science / v.29 no.6 / pp.407-410 / 2018
  • As the operating frequency of an electromagnetic wave increases, its wavelength and the achievable maximum output power decrease, so the circuit size cannot be scaled down freely. As a result, fabricating a high-power circuit (on the order of kW or more) in the terahertz frequency band is limited by the required circuit size, on the order of μm to mm. To overcome these limitations, we propose a source design technique for a 0.1 THz, ~0.3 GW-level device with a cylindrical shape (diameter ~2.4 cm). Modeling and computational simulations were performed to optimize the design of high-power electromagnetic sources based on Cherenkov radiation generation technology, using the principle of plasma wakefield acceleration with ponderomotive force and artificial dielectrics. An effective, objectively verified design guideline is proposed to facilitate the fabrication of large-diameter, high-power terahertz vacuum devices that are less restricted by circuit size.
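As a rough illustration of the Cherenkov mechanism the source relies on: radiation is emitted when the beam velocity exceeds the phase velocity of light in the medium, i.e., when βn > 1, and the emission angle follows cos θ = 1/(βn). The beam energy and effective refractive index below are illustrative assumptions, not values from the paper.

```python
import math

# Cherenkov condition check for a relativistic electron beam: emission
# occurs when beta * n > 1. Beam energy and the effective index of the
# artificial dielectric are assumed example values.
REST_ENERGY_MEV = 0.511          # electron rest energy

def beta_from_kinetic_energy(ke_mev):
    gamma = 1.0 + ke_mev / REST_ENERGY_MEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

beta = beta_from_kinetic_energy(5.0)   # a 5 MeV beam (assumed)
n_eff = 1.5                            # effective refractive index (assumed)
emits = beta * n_eff > 1.0
angle = math.degrees(math.acos(1.0 / (beta * n_eff))) if emits else None
print(round(beta, 5), emits, angle)
```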

Recent Progress in Air Conditioning and Refrigeration Research -A Review of Papers Published in the Korean Journal of Air-Conditioning and Refrigeration Engineering in 2000 and 2001- (공기조화, 냉동 분야의 최근 연구 동향 -2000년 및 2001년 학회지 논문에 대한 종합적 고찰 -)

  • 강신형;한화택;조금남;이승복;조형희;김민수
    • Korean Journal of Air-Conditioning and Refrigeration Engineering / v.14 no.12 / pp.1102-1139 / 2002
  • A review of the papers published in the Korean Journal of Air-Conditioning and Refrigerating Engineering in 2000 and 2001 has been conducted, focusing on the current status of research in heating, cooling, ventilation, sanitation, and the building environment. The conclusions are as follows. (1) Most fundamental studies on fluid flow were related to heat transport in facilities. Drop formation and rivulet flow on solid surfaces were interesting topics related to condensation augmentation. Research on the micro-environment considering flow, heat, and humidity was also of interest for comfortable living environments, and can be extended to biological aspects. The development of high-performance, low-noise fans and blowers was a continuing topic. Well-developed CFD technologies were widely applied to developing facilities and their systems. (2) Most papers related to heat transfer analysis and heat exchangers dealt with convection, evaporation, and channel flow for heat exchanger design applications. Numerical heat transfer simulation studies were performed and reported to show heat transfer characteristics. Experimental as well as numerical studies on heat exchangers were reported, while few papers were available on system analysis including heat exchangers. (3) A review of recent studies on heat pump systems shows that performance analysis and control of heat pumps were performed by various simulations and experiments. Papers on multi-type heat pump systems increased significantly. Studies on heat pipes experimentally examined changes in working characteristics and structure. Research on phase change has been carried out steadily, and operation strategies for encapsulated ice storage tanks are reported experimentally in several papers.
(4) Recent studies on refrigeration/air-conditioning systems have focused on system performance and efficiency for new alternative refrigerants. Evaporation and condensation heat transfer characteristics were investigated for different tube shapes and new alternative refrigerants. Studies on components of refrigeration/air-conditioning systems examined the efficiency of various compressors and the performance of new expansion devices. In addition to the thermophysical properties of refrigerant mixtures, studies on new refrigerants were also carried out; however, research on two-phase flow seemed insufficient. (5) A review of recent studies on absorption cooling systems indicates that heat and mass transfer phenomena have been investigated to improve absorber performance. Various experimental data have been presented and several simulation models proposed. A review of recent studies on ducts and ventilation shows that ventilation indices have been proposed to quantify ventilation performance in buildings and tunnels. Main efforts have focused on applying ventilation effectiveness in practice, either numerically using computational fluid dynamics or experimentally using tracer gas techniques. (6) Based on a review of recent studies on indoor thermal environment and building service systems, research has mainly focused on innovative ideas such as underfloor air-conditioning systems, personal environmental modules, and radiant floor cooling. New approaches for minimizing energy consumption as well as improving indoor environmental conditions, through predictive control of HVAC systems, various building energy management activities, and cost-benefit analysis for economic evaluation, were also highlighted.

Analyzing the Characteristics of Pre-service Elementary School Teachers' Modeling and Epistemic Criteria with the Blackbox Simulation Program (블랙박스 시뮬레이션에 참여한 초등예비교사의 모형 구성의 특징과 인식적 기준)

  • Park, Jeongwoo;Lee, Sun-Kyung;Shim, Han Su;Lee, Gyeong-Geon;Shin, Myeong-Kyeong
    • Journal of The Korean Association For Science Education / v.38 no.3 / pp.305-317 / 2018
  • In this study, we investigated the characteristics of participant students' modeling and their epistemic criteria with a blackbox simulation program. For this research, we developed a blackbox simulation program presenting an ill-structured problem situation that reflects scientific practice, and applied it in the activities. Twenty-three groups, comprising 89 second-year students of an education college, participated. They visualized, modeled, modified, and evaluated their ideas about the internal structure of the blackbox. All of the students' activities were recorded and analyzed. As a result, the students' models in the blackbox activities were categorized into four types according to their form and function. Model evaluation occurred during group model selection, where epistemic criteria such as empirical coherence, comprehensiveness, analogy, simplicity, and implementation were applied. The educational implications are as follows. First, the blackbox simulation activities in which the students participated have educational value in that they provide a context in which the nature of scientific practice can be experienced explicitly and implicitly by constructing and testing models. Second, from the beginning of the activity, epistemic criteria such as empirical coherence, comprehensiveness, analogy, simplicity, and implementation were not strictly applied but were adapted flexibly and dynamically according to the context. Third, studying epistemic criteria in various contexts, beyond the context of this study, will broaden our understanding of the nature of scientific practice. The simulation activity used here can also lead to research on computational thinking, which will become more important in future society. We expect further discussion as this study's context and method are elaborated and systematized.

Implementation of Markerless Augmented Reality with Deformable Object Simulation (변형물체 시뮬레이션을 활용한 비 마커기반 증강현실 시스템 구현)

  • Sung, Nak-Jun;Choi, Yoo-Joo;Hong, Min
    • Journal of Internet Computing and Services / v.17 no.4 / pp.35-42 / 2016
  • Recently, much research has focused on markerless augmented reality systems that use the face, feet, or hands of the user's body, to alleviate the many disadvantages of marker-based augmented reality systems. In addition, most existing augmented reality systems have used rigid objects, since they only aim to insert virtual objects and support basic interaction with them. In this paper, unlike display-bound, marker-based augmented reality systems restricted to rigid objects, we designed and implemented a markerless augmented reality system using deformable objects, applicable to various fields with interactive situations for a user. Deformable objects are generally implemented with mass-spring models or finite element models: a mass-spring model can provide real-time simulation, while a finite element model achieves more accurate results in physical and mathematical terms. The proposed markerless augmented reality system uses a mass-spring model with a tetrahedron structure to provide real-time simulation results. To produce plausible simulated interaction with deformable objects, the proposed method detects and tracks the user's hand with the Kinect SDK and calculates the external force applied to the object in the hand, based on the change in hand position. Based on this force, 4th-order Runge-Kutta integration is applied to compute the next position of the deformable object. In addition, to prevent excessive external force from hand movement and thus preserve the natural behavior of the deformable object, we set a threshold value and applied it whenever the hand movement exceeded the threshold. Each experimental test was repeated five times, and we analyzed the results based on the computational cost of the simulation.
We believe that the proposed markerless augmented reality system with deformable objects can overcome the weaknesses of traditional marker-based augmented reality systems with rigid objects, which are not suitable for various other fields, including healthcare and education.
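The integration step described above can be sketched for a single mass-spring node. The stiffness, damping, mass, time step, and force threshold below are illustrative values, not parameters from the paper; the sketch only shows RK4 stepping combined with clamping an excessive external (hand) force.

```python
import numpy as np

# One mass-spring node integrated with 4th-order Runge-Kutta, with the
# external (hand) force clamped by a threshold, as in the paper's idea
# of suppressing excessive forces. All constants are illustrative.
K, C, M = 50.0, 0.5, 1.0        # spring stiffness, damping, mass
FORCE_LIMIT = 10.0              # clamp for the external (hand) force

def deriv(state, f_ext):
    x, v = state
    f_ext = np.clip(f_ext, -FORCE_LIMIT, FORCE_LIMIT)
    a = (-K * x - C * v + f_ext) / M    # Newton's second law
    return np.array([v, a])

def rk4_step(state, f_ext, dt):
    k1 = deriv(state, f_ext)
    k2 = deriv(state + 0.5 * dt * k1, f_ext)
    k3 = deriv(state + 0.5 * dt * k2, f_ext)
    k4 = deriv(state + dt * k3, f_ext)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([0.0, 0.0])    # rest position, zero velocity
for _ in range(100):            # a large hand force is applied but clamped
    state = rk4_step(state, f_ext=500.0, dt=0.01)
print(state)
```

With the clamp active, the node settles toward the equilibrium set by the limited force (x = F/K = 0.2 here) instead of being flung away by the raw 500 N input.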

Using the METHONTOLOGY Approach to a Graduation Screen Ontology Development: An Experiential Investigation of the METHONTOLOGY Framework

  • Park, Jin-Soo;Sung, Ki-Moon;Moon, Se-Won
    • Asia Pacific Journal of Information Systems / v.20 no.2 / pp.125-155 / 2010
  • Ontologies have been adopted in various business and scientific communities as a key component of the Semantic Web. Despite the increasing importance of ontologies, ontology developers still perceive construction tasks as a challenge. A clearly defined and well-structured methodology can reduce the time required to develop an ontology and increase the probability of a project's success. However, no reliable knowledge-engineering methodology for ontology development currently exists; every methodology has been tailored toward the development of a particular ontology. In this study, we developed a Graduation Screen Ontology (GSO). The graduation screen domain was chosen for several reasons. First, the graduation screen process is a complicated task requiring a complex reasoning process. Second, GSO may be reused by other universities because the graduation screen process is similar at most universities. Finally, GSO can be built within a given period because the size of the selected domain is reasonable. No standard ontology development methodology exists; thus, one of the existing ontology development methodologies had to be chosen. The most important considerations for selecting the ontology development methodology for GSO included whether it can be applied to a new domain, whether it covers a broad set of development tasks, and whether it gives a sufficient explanation of each development task. We evaluated various ontology development methodologies based on the evaluation framework proposed by Gómez-Pérez et al. and concluded that METHONTOLOGY was the most applicable to the building of GSO for this study. METHONTOLOGY was derived from the experience of developing the Chemical Ontology at the Polytechnic University of Madrid by Fernández-López et al. and is regarded as the most mature ontology development methodology.
METHONTOLOGY describes a very detailed approach for building an ontology under a centralized development environment at the conceptual level. This methodology consists of three broad processes, each containing specific sub-processes: management (scheduling, control, and quality assurance); development (specification, conceptualization, formalization, implementation, and maintenance); and support (knowledge acquisition, evaluation, documentation, configuration management, and integration). An ontology development language and an ontology development tool for GSO construction also had to be selected. We adopted OWL-DL as the ontology development language. OWL was selected for its computational support for consistency checking and classification, which is crucial in developing coherent and useful ontological models for very complex domains. In addition, Protégé-OWL was chosen as the ontology development tool because it is supported by METHONTOLOGY and is widely used thanks to its platform-independent characteristics. Based on the researchers' experience developing GSO, some issues relating to METHONTOLOGY, OWL-DL, and Protégé-OWL were identified. We focus on presenting drawbacks of METHONTOLOGY and discussing how each weakness could be addressed. First, METHONTOLOGY insists that domain experts without ontology construction experience can easily build ontologies. However, it is still difficult for these domain experts to develop a sophisticated ontology, especially if they have insufficient background knowledge related to the ontology. Second, METHONTOLOGY does not include a development stage called the "feasibility study." This pre-development stage helps developers ensure not only that a planned ontology is necessary and sufficiently valuable to begin an ontology-building project, but also that the project will be successful.
Third, METHONTOLOGY excludes an explanation on the use and integration of existing ontologies. If an additional stage for considering reuse is introduced, developers might share benefits of reuse. Fourth, METHONTOLOGY fails to address the importance of collaboration. This methodology needs to explain the allocation of specific tasks to different developer groups, and how to combine these tasks once specific given jobs are completed. Fifth, METHONTOLOGY fails to suggest the methods and techniques applied in the conceptualization stage sufficiently. Introducing methods of concept extraction from multiple informal sources or methods of identifying relations may enhance the quality of ontologies. Sixth, METHONTOLOGY does not provide an evaluation process to confirm whether WebODE perfectly transforms a conceptual ontology into a formal ontology. It also does not guarantee whether the outcomes of the conceptualization stage are completely reflected in the implementation stage. Seventh, METHONTOLOGY needs to add criteria for user evaluation of the actual use of the constructed ontology under user environments. Eighth, although METHONTOLOGY allows continual knowledge acquisition while working on the ontology development process, consistent updates can be difficult for developers. Ninth, METHONTOLOGY demands that developers complete various documents during the conceptualization stage; thus, it can be considered a heavy methodology. Adopting an agile methodology will result in reinforcing active communication among developers and reducing the burden of documentation completion. Finally, this study concludes with contributions and practical implications. No previous research has addressed issues related to METHONTOLOGY from empirical experiences; this study is an initial attempt. In addition, several lessons learned from the development experience are discussed. 
This study also affords some insights for ontology methodology researchers who want to design a more advanced ontology development methodology.

Comparison of Algorithms for Generating Parametric Image of Cerebral Blood Flow Using H₂¹⁵O Positron Emission Tomography (H₂¹⁵O PET을 이용한 뇌혈류 파라메트릭 영상 구성을 위한 알고리즘 비교)

  • Lee, Jae-Sung;Lee, Dong-Soo;Park, Kwang-Suk;Chung, June-Key;Lee, Myung-Chul
    • The Korean Journal of Nuclear Medicine / v.37 no.5 / pp.288-300 / 2003
  • Purpose: To obtain regional blood flow and the tissue-blood partition coefficient from H₂¹⁵O PET time-activity curves, fitting of the parameters in the Kety model is conventionally accomplished by nonlinear least squares (NLS) analysis. However, NLS requires considerable computation time and is thus impractical for the pixel-by-pixel analysis needed to generate parametric images of these parameters. In this study, we investigated several fast parameter estimation methods for parametric image generation and compared their statistical reliability and computational efficiency. Materials and Methods: These methods included linear least squares (LLS), linear weighted least squares (LWLS), linear generalized least squares (GLS), linear generalized weighted least squares (GWLS), weighted integration (WI), and a model-based clustering method (CAKS). H₂¹⁵O dynamic brain PET with a Poisson noise component was simulated using the numerical Zubal brain phantom. Error and bias in the estimation of rCBF and the partition coefficient, as well as computation time, were estimated and compared in various noise environments. In addition, parametric images from H₂¹⁵O dynamic brain PET data acquired from 16 healthy volunteers under various physiological conditions were compared to examine the utility of these methods for real human data. Results: These fast algorithms produced parametric images with similar image quality and statistical reliability. When the CAKS and LLS methods were used in combination, computation time was significantly reduced, to less than 30 seconds for 128×128×46 images on a Pentium III processor. Conclusion: Parametric images of rCBF and the partition coefficient with good statistical properties can be generated within a computation time short enough to be acceptable in clinical situations.
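The reason LLS is fast enough for pixel-by-pixel parametric imaging can be seen from the one-tissue Kety model: integrating dCt/dt = K1·Ca(t) − k2·Ct(t) gives Ct(t) = K1·∫Ca − k2·∫Ct, which is linear in (K1, k2), so a single least-squares solve replaces an iterative nonlinear fit. The sketch below uses a synthetic input function and rate constants, not patient data, and is an illustration of the idea rather than the paper's implementation.

```python
import numpy as np

# One-tissue Kety model: dCt/dt = K1*Ca(t) - k2*Ct(t). Integrating both
# sides gives Ct(t) = K1*int(Ca) - k2*int(Ct), linear in (K1, k2).
# Input function and rate constants are synthetic illustrations.
t = np.linspace(0.0, 120.0, 1201)           # seconds
dt = t[1] - t[0]
ca = t * np.exp(-t / 20.0)                  # synthetic arterial input

K1_true, k2_true = 0.6, 0.15
ct = np.zeros_like(t)
for i in range(1, len(t)):                  # forward-Euler tissue curve
    ct[i] = ct[i - 1] + dt * (K1_true * ca[i - 1] - k2_true * ct[i - 1])

int_ca = np.cumsum(ca) * dt                 # running integrals
int_ct = np.cumsum(ct) * dt
A = np.column_stack([int_ca, -int_ct])      # linear design matrix
K1_est, k2_est = np.linalg.lstsq(A, ct, rcond=None)[0]
print(K1_est, k2_est)
```

The estimates recover the true rate constants to within a few percent (limited by the discretization), and the per-pixel cost is one small matrix solve, which is why LLS-family methods scale to whole-volume parametric images.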

A study on the use of a Business Intelligence system : the role of explanations (비즈니스 인텔리전스 시스템의 활용 방안에 관한 연구: 설명 기능을 중심으로)

  • Kwon, YoungOk
    • Journal of Intelligence and Information Systems / v.20 no.4 / pp.155-169 / 2014
  • With rapid advances in technology, organizations increasingly depend on information systems in their decision-making processes. Business Intelligence (BI) systems, in particular, have become a mainstay in dealing with complex problems in an organization, partly because a variety of advanced computational methods from statistics, machine learning, and artificial intelligence can be applied to solve business problems such as demand forecasting. In addition to the ability to analyze past and present trends, these predictive analytics capabilities provide huge value to an organization's ability to respond to changes in markets, business risks, and customer trends. While the performance effects of BI system use in organizational settings have been studied, little has been discussed about the use of the predictive analytics technologies embedded in BI systems for forecasting tasks. Thus, this study aims to find important factors that can help organizations take advantage of the benefits of a BI system's advanced technologies. More generally, a BI system can be viewed as an advisor, defined as one that formulates judgments or recommends alternatives and communicates these to the person in the role of the judge, with the information generated by the BI system serving as advice that a decision maker (the judge) can follow. Thus, we draw on findings from the advice-giving and advice-taking literature, focusing on the role of the system's explanations in users' advice taking. It has been shown that advice discounting can occur when an advisor's reasoning, or the evidence justifying the advisor's decision, is not available. However, the majority of current BI systems merely provide a number, which may influence decision makers in accepting the advice and inferring its quality.
In this study, we explore the following key factors that can influence users' advice taking within the setting of a BI system: explanations of how the box-office grosses are predicted; the type of advisor, i.e., a system (data mining technique) or a human-based business advice mechanism such as prediction markets (aggregated human advice) and human advisors (individual human expert advice); users' evaluations of the provided advice; and individual differences among decision makers. Each subject performs the following four tasks by going through a series of display screens on the computer. First, given information about a movie, such as its director and genre, the subjects are asked to predict the movie's opening-weekend box office. Second, in light of the information generated by an advisor, the subjects are asked to adjust their original predictions, if they desire to do so. Third, they are asked to evaluate the value of the given information (e.g., perceived usefulness, trust, satisfaction). Lastly, a short survey is conducted to identify individual differences that may affect advice taking. The results of the experiment show that subjects are more likely to follow system-generated advice than human advice when the advice is provided with an explanation. When the subjects, as system users, think the information provided by the system is useful, they are also more likely to take the advice. In addition, individual differences affect advice taking. Subjects with more expertise about advisors, or who tend to agree with others, adjust their predictions to follow the advice. On the other hand, subjects with more knowledge of movies are less affected by the advice, and their final decisions remain close to their original predictions. The advances in the predictive analytics of BI systems demonstrate great potential to support increasingly complex business decisions.
This study shows how the designs of a BI system can play a role in influencing users' acceptance of the system-generated advice, and the findings provide valuable insights on how to leverage the advanced predictive analytics of the BI system in an organization's forecasting practices.