• Title/Abstract/Keywords: Quantitative Approaches

Search results: 416 (processing time: 0.022 s)

Metadiscourse in the Bank Negara Malaysia Governor's Speech Texts

  • Aziz, Roslina Abdul;Baharum, Norzie Diana
    • 아시아태평양코퍼스연구 / Vol. 2, No. 2 / pp. 1-15 / 2021
  • The study aims to explore the use of metadiscourse in the Bank Negara Malaysia Governor's speeches based on Hyland's Interpersonal Model of Metadiscourse. The corpus data consist of 343 speech texts, extracted from the Malaysian Corpus of Financial English (MacFE) and amounting to 688,778 tokens. Adopting both quantitative and qualitative approaches to data analysis, the study investigates (1) the overall use of metadiscourse in the Bank Negara Governor's speech texts and (2) the functions of the most prominent metadiscourse resources used in the speech texts. The findings reveal the Governor's speech texts to be interactional rather than interactive, with a rich distribution of interactional metadiscourse resources, namely engagement markers, self-mentions, hedges, boosters and attitude markers, throughout the texts. These interactional metadiscourse resources function to establish speaker-audience engagement and alignment of views, as well as to express degrees of certainty and uncertainty and attitudes. The study concludes that the speech texts are not merely informational or propositional, but rather interpersonal.
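
Frequency counts of this kind are usually produced with a simple corpus script. The sketch below is only an illustrative example (not the authors' procedure) of counting a few hedges and boosters per 1,000 tokens; the marker lists and the corpus file name are hypothetical and far shorter than Hyland's full inventories.

```python
import re
from collections import Counter

# Hypothetical, abbreviated marker lists; Hyland's model uses much fuller inventories.
HEDGES = {"may", "might", "perhaps", "possible", "likely", "suggest"}
BOOSTERS = {"clearly", "certainly", "definitely", "indeed", "must"}
MARKERS = HEDGES | BOOSTERS

def marker_frequencies(text: str) -> dict:
    """Count hedge/booster tokens and normalise per 1,000 tokens."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(t for t in tokens if t in MARKERS)
    per_kilo = 1000.0 / max(len(tokens), 1)
    return {
        "tokens": len(tokens),
        "hedges_per_1000": sum(counts[h] for h in HEDGES) * per_kilo,
        "boosters_per_1000": sum(counts[b] for b in BOOSTERS) * per_kilo,
    }

if __name__ == "__main__":
    with open("governor_speeches.txt", encoding="utf-8") as f:  # hypothetical corpus file
        print(marker_frequencies(f.read()))
```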

Equal Gain Block Decomposition Methods for Multiuser MIMO Networks

  • Hwang, Insoo;Kang, Inseok;Hwang, Intae;You, Cheolwoo
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 15, No. 3 / pp. 1156-1173 / 2021
  • In this paper, we propose a new joint precoder and postcoder design strategy to support multiple streams per user in multiuser multiple-input multiple-output (MIMO) systems. We propose a two-step precoding strategy using equal channel gain decomposition and block diagonalization at the transmitter. With the proposed precoder, the multiuser MIMO channel can be decomposed into multiple parallel channels with equal channel gain per user. After applying the receive postcoder, which is generated and sent by the transmitter, we can use an ML-based decoder per stream to achieve full receive diversity. The achievable sum-rate bound and diversity performance of the proposed algorithm are presented, together with the feedback signaling design and a quantitative complexity analysis. Simulation results show that the proposed algorithm asymptotically approaches the sum-rate capacity of the MIMO broadcast channel while maintaining full diversity order.
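
As a rough illustration of the block diagonalization step mentioned in the abstract (the equal channel gain decomposition and the postcoder design are specific to the paper and not reproduced here), the NumPy sketch below nulls inter-user interference by constraining each user's precoder to the null space of the other users' channels. The antenna dimensions and random channels are invented for the example.

```python
import numpy as np

def block_diagonalization(H_list, Nt):
    """Block diagonalization: user k's precoder lies in the null space of the
    other users' stacked channels, so H_j @ F_k == 0 for every j != k."""
    precoders = []
    for k, Hk in enumerate(H_list):
        H_others = np.vstack([H for j, H in enumerate(H_list) if j != k])
        # Right singular vectors beyond the rank of H_others span its null space.
        _, s, Vh = np.linalg.svd(H_others)
        null_dim = Nt - int(np.sum(s > 1e-10))
        Vnull = Vh.conj().T[:, Nt - null_dim:]              # Nt x null_dim null-space basis
        # Transmit along the strongest directions of the effective channel Hk @ Vnull.
        _, _, Vh_eff = np.linalg.svd(Hk @ Vnull)
        streams = min(Hk.shape[0], null_dim)
        precoders.append(Vnull @ Vh_eff.conj().T[:, :streams])
    return precoders

# Toy example: 2 users, 2 receive antennas each, 4 transmit antennas.
rng = np.random.default_rng(0)
H_list = [rng.normal(size=(2, 4)) + 1j * rng.normal(size=(2, 4)) for _ in range(2)]
F = block_diagonalization(H_list, Nt=4)
print(np.linalg.norm(H_list[0] @ F[1]))  # ~0: user 1's precoded signal causes no interference at user 0
```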

THE PRICE OF RISK IN CONSTRUCTION PROJECTS: CONTINGENCY APPROXIMATION MODEL (CAM)

  • S. Laryea;E. Badu;I. K. Dontwi
    • 국제학술발표논문집 / The 2nd International Conference on Construction Engineering and Project Management / pp. 106-118 / 2007
  • Little attention has been focused on a precise definition and evaluation mechanism for project management risk specifically related to contractors. When bidding, contractors traditionally price risks using unsystematic approaches. The high business failure rate our industry records may indicate that the current unsystematic mechanisms contractors use for building up contingencies are inadequate. The reluctance of some contractors to include a price for risk in their tenders when bidding for work competitively may also not be a useful approach. Here, instead, we first define the meaning of contractor contingency, and then develop a straightforward quantitative technique that contractors can use to estimate a price for project risk. This model will help contractors analyse their exposure to project risks and express that risk in monetary terms for management action. When bidding for work, they can then decide how to allocate contingencies strategically in a way that balances risk and reward.
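
The paper's Contingency Approximation Model itself is not reproduced here; as a generic illustration of pricing risk in monetary terms, the sketch below prices a contingency as the expected monetary value of an itemised risk register. All risk items, probabilities, and costs are invented for the example.

```python
# A minimal, generic risk-pricing sketch (not the paper's CAM model):
# contingency = sum over risk items of (probability of occurrence x cost impact).
risk_register = [
    # (description, probability of occurrence, cost impact if it occurs)
    ("ground conditions worse than expected", 0.30, 120_000.0),
    ("steel price escalation",                0.50,  80_000.0),
    ("design change by client",               0.20, 150_000.0),
]

contingency = sum(p * impact for _, p, impact in risk_register)
bid_base_cost = 2_500_000.0  # hypothetical estimated base cost of the works
print(f"Contingency: {contingency:,.0f} ({contingency / bid_base_cost:.1%} of base cost)")
```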

Empirical Analysis of Starting Salaries of College Graduates based on Their University-Industry Cooperation Activities

  • Mun-Su Park
    • International Journal of Knowledge Content Development & Technology / Vol. 13, No. 1 / pp. 101-109 / 2023
  • Fifteen years have passed since the enactment of the Industrial Technology Innovation Promotion Act, which promoted industry cooperation activities at universities. This study therefore analyzes the relationship between universities' industry cooperation activities and college graduates' starting salaries, and provides policy suggestions for improving the direction of university-industry cooperation. The study used nine years of panel data from the Graduates Occupational Mobility Survey (GOMS) for an empirical analysis and found that starting salaries of college graduates were not significantly higher if the university only participated in basic industry cooperation activities. On the other hand, when the quality of university-industry cooperation activities was higher, for example through job search support, the starting salaries of college graduates were higher. The findings suggest that university-industry cooperation activities must focus on qualitative performance rather than quantitative approaches.
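
As a hedged illustration of this kind of wage regression (the authors' actual GOMS variables and specification are not reproduced), the sketch below fits a pooled OLS model of log starting salary on indicators of basic and quality-oriented university-industry cooperation with year fixed effects. All column names and values are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical mini-panel of graduates; in the paper this role is played by GOMS data.
df = pd.DataFrame({
    "log_salary":   [7.8, 7.9, 8.1, 7.7, 8.0, 8.2, 7.9, 8.3],
    "coop_basic":   [1, 1, 0, 1, 0, 1, 0, 1],   # university joined basic cooperation activities
    "coop_quality": [0, 1, 1, 0, 0, 1, 0, 1],   # e.g., job-search support programmes
    "year":         [2014, 2014, 2014, 2014, 2015, 2015, 2015, 2015],
})

# Pooled OLS with year fixed effects; coefficients on coop_* approximate salary premia.
model = smf.ols("log_salary ~ coop_basic + coop_quality + C(year)", data=df).fit()
print(model.summary())
```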

Multi-Focus Image Fusion Using Transformation Techniques: A Comparative Analysis

  • Ali Alferaidi
    • International Journal of Computer Science & Network Security / Vol. 23, No. 4 / pp. 39-47 / 2023
  • This study compares various transformation techniques for multi-focus image fusion. Multi-focus image fusion is the procedure of merging multiple images captured at different focus distances to produce a single composite image with improved sharpness and clarity. The purpose of this research is to compare popular frequency-domain approaches for multi-focus image fusion, such as the Discrete Wavelet Transform (DWT), Stationary Wavelet Transform (SWT), DCT-based Laplacian Pyramid (DCT-LP), Discrete Cosine Harmonic Wavelet Transform (DC-HWT), and Dual-Tree Complex Wavelet Transform (DT-CWT). The objective is to increase the understanding of these transformation techniques and how they can be utilized in conjunction with one another. The analysis evaluates the 10 most crucial parameters and highlights the unique features of each method. The results help determine which transformation technique is best suited for multi-focus image fusion applications. Based on the visual and statistical analysis, the DCT-LP is suggested as the most appropriate technique, and the results also provide valuable insights into choosing the right approach.
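
As an illustration of the DWT-based fusion family compared in the paper (not the authors' exact pipeline or fusion rule), the PyWavelets sketch below fuses two grayscale images by averaging the approximation band and keeping the larger-magnitude coefficient in each detail band. The inputs here are synthetic arrays standing in for the near-focus and far-focus images.

```python
import numpy as np
import pywt

def dwt_fuse(img_a: np.ndarray, img_b: np.ndarray, wavelet: str = "db2") -> np.ndarray:
    """Single-level DWT fusion: average the approximation coefficients,
    take the maximum-magnitude coefficient in each detail sub-band."""
    cA_a, (cH_a, cV_a, cD_a) = pywt.dwt2(img_a, wavelet)
    cA_b, (cH_b, cV_b, cD_b) = pywt.dwt2(img_b, wavelet)

    def fuse_detail(x, y):
        return np.where(np.abs(x) >= np.abs(y), x, y)

    fused = ((cA_a + cA_b) / 2.0,
             (fuse_detail(cH_a, cH_b), fuse_detail(cV_a, cV_b), fuse_detail(cD_a, cD_b)))
    return pywt.idwt2(fused, wavelet)

# Toy example with synthetic "near-focus" and "far-focus" images.
rng = np.random.default_rng(1)
a = rng.random((128, 128))
b = rng.random((128, 128))
print(dwt_fuse(a, b).shape)  # (128, 128)
```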

Single-Cell Genomics for Investigating Pathogenesis of Inflammatory Diseases

  • Seyoung Jung;Jeong Seok Lee
    • Molecules and Cells / Vol. 46, No. 2 / pp. 120-129 / 2023
  • Recent technical advances have enabled unbiased transcriptomic and epigenetic analysis of individual cells, known as "single-cell analysis". Single-cell analysis encompasses a variety of technical approaches for investigating the state of each cell, including mRNA levels (transcriptome), the immune repertoire (immune repertoire analysis), cell surface proteins (surface proteome analysis), chromatin accessibility (epigenome), and association with genome variants (expression quantitative trait loci, eQTLs). Because it is an effective tool for investigating the robust immune responses in coronavirus disease 2019 (COVID-19), many researchers have performed single-cell analysis to capture diverse immune cell activation and differentiation in an unbiased manner. Despite the challenges of elucidating the complicated immune microenvironments of chronic inflammatory diseases with existing experimental methods, it is now possible to capture the simultaneous immune features of different cell types across inflamed tissues using various single-cell tools. In this review, we introduce patient-based and experimental mouse model research utilizing single-cell analyses in the field of chronic inflammatory diseases, as well as multi-organ atlases targeting immune cells.

PRIVATE DEVELOPERS' UNDERSTANDING ON THE IMPLEMENTATION OF STRATEGIC PARTNERING IN THE MALAYSIAN CONSTRUCTION INDUSTRY

  • Faridah Muhamad Halil;Mohammad Fadhil Mohammad;Rohana Mahbub;Ani Saifuza Shukor
    • 국제학술발표논문집 / The 4th International Conference on Construction Engineering and Project Management Organized by the University of New South Wales / pp. 542-549 / 2011
  • This research attempts to reveal whether private developers in the Malaysian construction industry have been practicing strategic partnering in their organizations. While the investigation was conducted using both quantitative and qualitative approaches, this paper reports only the results obtained from the questionnaire survey. Results from the questionnaire survey indicate that private developers in the Malaysian construction industry have implemented strategic partnering in their organizations. The elements of the partnering process, namely partnering formation, partnering application, and partnering completion or reactivation, were tested. The results show that all the elements of the partnering process have been exercised in their projects. Thus it can be surmised that strategic partnering has been practiced by private developers in the Malaysian construction industry.

Measuring Industry Regulations Using an Agent-based Model: The Case of Online Games in Korea

  • Taekyung Kim;Seongmin Jeon;Jongil Kim
    • Asia Pacific Journal of Information Systems / Vol. 29, No. 2 / pp. 165-180 / 2019
  • As the game industry prospers, the negative side of games has been highlighted along with its contribution to economic growth. In spite of strong arguments for the necessity of regulation as a means to decrease addiction or overindulgence, research has produced suggestions for the future rather than quantifiable evidence. In this paper, we propose adopting a simulation approach, in addition to quantitative approaches, to better understand optimal regulatory levels, since a simulation approach can visualize unexpected side effects of regulations. In this study, we suggest the application of an agent-based model (ABM) as a smart service to measure the effects of regulatory policies. We review cases applying ABMs in various domains and consider the possibility of using an ABM to understand the effectiveness of web board-game regulations. We find that the ABM approach would be useful in several areas, such as the analysis of regulatory effects that reflect a variety of characteristics, the measurement of micro-level regulatory effects, and the simulation of regulations.
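
As a toy illustration of how an agent-based simulation can surface both intended and unintended regulatory effects (this is not the authors' model; the behavioural rules, thresholds, and parameters are all invented), the sketch below simulates players whose monthly spending is truncated by a regulatory cap and reports the share of over-spenders alongside total platform revenue.

```python
import random

def simulate(spend_cap=None, n_agents=10_000, seed=42):
    """Toy ABM: each agent draws an intended monthly spend; a cap truncates it.
    Returns the share of 'overindulgent' agents and total platform revenue."""
    rng = random.Random(seed)
    overindulgent, revenue = 0, 0.0
    for _ in range(n_agents):
        intended = rng.lognormvariate(3.0, 1.2)            # heterogeneous spending propensity
        actual = min(intended, spend_cap) if spend_cap else intended
        if actual > 100.0:                                  # arbitrary overindulgence threshold
            overindulgent += 1
        revenue += actual
    return overindulgent / n_agents, revenue

for cap in (None, 500.0, 300.0, 100.0):
    share, rev = simulate(spend_cap=cap)
    print(f"cap={cap}: overindulgent={share:.1%}, revenue={rev:,.0f}")
```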

Assessment of Landslide Susceptibility using a Coupled Infinite Slope Model and Hydrologic Model in Jinbu Area, Gangwon-Do (무한사면모델과 수리학적 모델의 결합을 통한 강원도 진부지역의 산사태 취약성 분석)

  • 이정현;박혁진
    • 자원환경지질 / Vol. 45, No. 6 / pp. 697-707 / 2012
  • Quantitative landslide susceptibility analysis can be divided into statistical and geomechanical approaches, depending on how the landslide-triggering factors and the model are treated. The geomechanical approach assumes a landslide model and evaluates susceptibility from the geometry of the slope and the engineering properties of the slope materials; because it can account for the failure mechanism and process, it is reported to be one of the most effective techniques for landslide susceptibility analysis. In recent geomechanical analyses, the infinite slope model has mainly been used as the slope model, and the use of GIS has made regional-scale susceptibility analysis with the infinite slope model possible. However, previous studies using the infinite slope model assumed the groundwater level arbitrarily, without considering the ground conditions or rainfall characteristics, and therefore could not account for how saturation varies with rainfall and the geotechnical properties of the study area. To address this, this study coupled the infinite slope model with a hydrologic model that reflects rainfall intensity, the most influential landslide-triggering factor, and the hydraulic properties of the ground, and performed a landslide susceptibility analysis that reflects the field conditions of the study area. In addition, to compare the proposed approach with the conventional method, the analysis was carried out for the Jinbu area, Gangwon-do, where large-scale landslides occurred in July 2006. The results show that the proposed approach yields higher prediction accuracy than the conventional method.
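
As a hedged sketch of this kind of coupled analysis (the paper's exact hydrologic formulation is not reproduced; the wetness term below follows a common SHALSTAB/SINMAP-style steady-state approximation, and all parameter values are illustrative, not those of the Jinbu study area), the factor of safety for a cell can be written as FS = c'/(γ_s·z·sinβ·cosβ) + (1 − m·γ_w/γ_s)·tanφ'/tanβ, with the relative wetness m estimated from rainfall, transmissivity, and upslope contributing area.

```python
import numpy as np

GAMMA_W = 9.81  # unit weight of water, kN/m^3

def relative_wetness(q, a, b, T, slope_rad):
    """Steady-state topographic wetness m = q*a / (b*T*sin(beta)), capped at 1
    (a SHALSTAB/SINMAP-style approximation; the paper's hydrologic model may differ)."""
    return np.clip(q * a / (b * T * np.sin(slope_rad)), 0.0, 1.0)

def factor_of_safety(c, phi_deg, gamma_s, z, slope_deg, m):
    """Infinite slope factor of safety with slope-parallel seepage and relative wetness m."""
    beta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    cohesion_term = c / (gamma_s * z * np.sin(beta) * np.cos(beta))
    friction_term = (1.0 - m * GAMMA_W / gamma_s) * np.tan(phi) / np.tan(beta)
    return cohesion_term + friction_term

# Illustrative single-cell parameters.
slope_deg = 32.0
m = relative_wetness(q=50e-3 / 86400,   # steady rainfall of ~50 mm/day, converted to m/s
                     a=800.0, b=10.0,   # contributing area (m^2) and contour width (m)
                     T=5e-4,            # soil transmissivity (m^2/s)
                     slope_rad=np.radians(slope_deg))
fs = factor_of_safety(c=5.0, phi_deg=30.0, gamma_s=18.0, z=1.5, slope_deg=slope_deg, m=m)
print(f"relative wetness m = {m:.2f}, FS = {fs:.2f}  (FS < 1 indicates instability)")
```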

Hardware Approach to Fuzzy Inference―ASIC and RISC―

  • Watanabe, Hiroyuki
    • 한국지능시스템학회:학술대회논문집 / 한국퍼지및지능시스템학회 1993년도 Fifth International Fuzzy Systems Association World Congress 93 / pp. 975-976 / 1993
  • This talk presents an overview of the author's research and development activities on fuzzy inference hardware, which involve two distinct approaches. The first approach uses application-specific integrated circuit (ASIC) technology, in which the fuzzy inference method is directly implemented in silicon. The second approach, which is in its preliminary stage, uses a more conventional microprocessor architecture; here, we apply a quantitative technique used by designers of reduced instruction set computers (RISC) to modify the architecture of a microprocessor. In the ASIC approach, we implemented the most widely used fuzzy inference mechanism directly on silicon. The mechanism is based on the max-min compositional rule of inference and Mamdani's method of fuzzy implication. Two VLSI fuzzy inference chips were designed, fabricated, and fully tested, both in full-custom CMOS technology. The second and more elaborate chip was designed at the University of North Carolina (UNC) in cooperation with MCNC. Both VLSI chips had multiple datapaths for rule evaluation and executed multiple fuzzy if-then rules in parallel. The AT&T chip is the first digital fuzzy inference chip in the world. It ran with a 20 MHz clock and achieved approximately 80,000 Fuzzy Logical Inferences Per Second (FLIPS), storing and executing 16 fuzzy if-then rules. Since it was designed as a proof-of-concept prototype, it had a minimal amount of peripheral logic for system integration. The UNC/MCNC chip consists of 688,131 transistors, of which 476,160 are used for RAM. It ran with a 10 MHz clock, has a 3-stage pipeline, and initiates the computation of a new inference every 64 cycles, achieving approximately 160,000 FLIPS. The new architecture has the following important improvements over the AT&T chip: programmable rule-set memory (RAM); on-chip fuzzification by a table-lookup method; on-chip defuzzification by a centroid method; a reconfigurable architecture for processing two rule formats; and RAM/datapath redundancy for higher yield. It can store and execute 51 if-then rules of the following format: IF A and B and C and D Then Do E, and Then Do F. With this format, the chip takes four inputs and produces two outputs. By software reconfiguration, it can store and execute 102 if-then rules of the following simpler format using the same datapath: IF A and B Then Do E. With this format, the chip takes two inputs and produces one output. We have built two VME-bus board systems based on this chip for Oak Ridge National Laboratory (ORNL). The board is now installed in a robot at ORNL, where researchers use it for experiments in autonomous robot navigation. The Fuzzy Logic system board places the fuzzy chip into a VMEbus environment; high-level C language functions hide the operational details of the board from the application programmer, who treats rule memories and fuzzification function memories as local structures passed as parameters to the C functions. ASIC fuzzy inference hardware is extremely fast, but it is limited in generality, since many aspects of the design are fixed. We have therefore proposed designing a fuzzy information processor as an application-specific processor using a quantitative approach, the approach developed by RISC designers.
In effect, we are interested in evaluating the effectiveness of a specialized RISC processor for fuzzy information processing. As the first step, we measured the possible speed-up of a fuzzy inference program based on if-then rules from the introduction of specialized instructions, i.e., min and max instructions. The minimum and maximum operations are heavily used in fuzzy logic applications as fuzzy intersection and union. We performed measurements using a MIPS R3000 as the base microprocessor. The initial result is encouraging: we could achieve as much as a 2.5-fold increase in inference speed if the R3000 had min and max instructions. They are also useful for speeding up other fuzzy operations such as bounded product and bounded sum. An embedded processor's main task is to control some device or process, and it usually runs a single or a few dedicated programs, so modifying a microprocessor to create an embedded processor for fuzzy control is very effective. Table I shows the measured inference speed of a MIPS R3000 microprocessor, a fictitious MIPS R3000 with min and max instructions, and the UNC/MCNC ASIC fuzzy inference chip. The software used on the microprocessors is a simulator of the ASIC chip. The first row is the computation time in seconds for 6,000 inferences using 51 rules, where each fuzzy set is represented by an array of 64 elements; the second row is the time required to perform a single inference; and the last row is the fuzzy logical inferences per second (FLIPS) measured for each device. There is a large gap in run time between the ASIC and software approaches even if we resort to a specialized fuzzy microprocessor. As for design time and cost, these two approaches represent two extremes, an ASIC approach being extremely expensive. It is, therefore, an important research topic to design a specialized computing architecture for fuzzy applications that falls between these two extremes in both run time and design time/cost.

TABLE I. Inference time with 51 rules (each fuzzy set represented by a 64-element array)

                       MIPS R3000 (regular)   MIPS R3000 (with min/max)   UNC/MCNC ASIC
    6,000 inferences   125 s                  49 s                        0.0038 s
    1 inference        20.8 ms                8.2 ms                      6.4 µs
    FLIPS              48                     122                         156,250
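
As a compact sketch of the max-min (Mamdani) inference that the chips implement, using discretised fuzzy sets like the 64-element arrays mentioned above (the rule base and membership functions here are invented for illustration), note how min and max dominate the computation, which is why dedicated min/max instructions or datapaths pay off.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 64)                      # 64-element universe of discourse, as on the chips

def tri(a, b, c):
    """Triangular membership function discretised on x (requires a < b < c)."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Two toy rules: IF error is SMALL THEN output is LOW; IF error is LARGE THEN output is HIGH.
rules = [
    {"antecedent": tri(0.0, 0.2, 0.5), "consequent": tri(0.0, 0.2, 0.4)},
    {"antecedent": tri(0.5, 0.8, 1.0), "consequent": tri(0.6, 0.8, 1.0)},
]

def infer(error: float) -> float:
    """Max-min compositional inference with Mamdani (min) implication and centroid defuzzification."""
    idx = int(round(error * 63))                   # look up the crisp input's membership grades
    aggregated = np.zeros_like(x)
    for rule in rules:
        strength = rule["antecedent"][idx]                    # firing strength (min over antecedents)
        clipped = np.minimum(strength, rule["consequent"])    # Mamdani implication
        aggregated = np.maximum(aggregated, clipped)          # max aggregation across rules
    return float(np.sum(x * aggregated) / (np.sum(aggregated) + 1e-9))  # centroid defuzzification

print(infer(0.2), infer(0.9))   # low input -> low output (~0.2), high input -> high output (~0.8)
```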
