• Title/Abstract/Keyword: Hard point data

Search results: 131 items (processing time: 0.122 seconds)

Radio Resource Management of CoMP System in HetNet under Power and Backhaul Constraints

  • Yu, Jia;Wu, Shaohua;Lin, Xiaodong;Zhang, Qinyu
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 8 No. 11
    • /
    • pp.3876-3895
    • /
    • 2014
  • Recently, Heterogeneous Networks (HetNets) with the Coordinated Multi-Point (CoMP) scheme have been introduced into Long Term Evolution-Advanced (LTE-A) systems to improve digital services for User Equipments (UEs), especially cell-edge UEs. However, Radio Resource Management (RRM), including Resource Block (RB) scheduling and Power Allocation (PA), becomes challenging in this scenario due to intercell cooperation. In this paper, we investigate the RRM problem for downlink transmission of a HetNet system with Joint Processing (JP) CoMP (both joint transmission and dynamic cell selection schemes), aiming to maximize the weighted sum data rate under constraints on both transmission power and backhaul capacity. First, the joint RB scheduling and PA problem is formulated as a constrained Mixed Integer Program (MIP), which is NP-hard. To simplify the formulated problem, we decompose it into separate RB scheduling and PA problems. For RB scheduling, we propose a low-complexity algorithm that achieves a suboptimal solution. Then, given the obtained scheduling results, we present an iterative Karush-Kuhn-Tucker (KKT) method to solve the PA problem. Extensive simulations are conducted to verify the effectiveness and efficiency of the proposed algorithms. Two kinds of JP CoMP schemes are compared with a non-CoMP greedy scheme (max-capacity scheme). Simulation results show that the CoMP schemes with the proposed RRM algorithms dramatically enhance the data rate of cell-edge UEs, thereby improving data-rate fairness among UEs. It is also shown that the proposed PA algorithms can decrease the power consumption of transmission antennas without loss of transmission performance.
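The iterative KKT power allocation described in the abstract is closely related to classic water-filling, which solves the KKT conditions of a sum-rate maximization under a total power constraint. A minimal illustrative sketch (not the authors' exact algorithm; the channel gains and the single power budget are assumptions for this example):

```python
import numpy as np

def waterfill(gains, p_total, eps=1e-9):
    """Water-filling power allocation from the KKT conditions:
    p_k = max(0, mu - 1/g_k) with sum(p_k) = p_total,
    where mu = 1/lambda is the water level, found by bisection."""
    g = np.asarray(gains, dtype=float)
    lo, hi = 0.0, p_total + 1.0 / g.min()  # bracket for the water level
    while hi - lo > eps:
        mu = (lo + hi) / 2.0
        p = np.maximum(0.0, mu - 1.0 / g)
        if p.sum() > p_total:              # level too high: lower it
            hi = mu
        else:                              # level too low: raise it
            lo = mu
    return np.maximum(0.0, (lo + hi) / 2.0 - 1.0 / g)

# three channels with decreasing gain share a power budget of 3.0
p = waterfill([2.0, 1.0, 0.5], p_total=3.0)
```

Bisection on the water level converges quickly because the total allocated power is monotone in the level; stronger channels receive more power.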

갑상선암 환자의 방사성요오드 치료경험 (The Experience of Receiving Radioactive Iodine Therapy among Thyroid Cancer Patients)

  • 강경옥;김현경;김지영;임석태
    • 동서간호학연구지
    • /
    • Vol. 22 No. 2
    • /
    • pp.148-157
    • /
    • 2016
  • Purpose: The purpose of this study was to explore the meaning of the experience of receiving radioactive iodine therapy among patients with thyroid cancer. Methods: A qualitative research design was adopted. The participants were ten women diagnosed with thyroid cancer who had received radioactive iodine therapy within one year. Data were collected through in-depth interviews from October 2015 to April 2016. Individual interviews were recorded, and the transcribed data were analyzed using Colaizzi's method. Results: The six categories of the experience of receiving radioactive iodine therapy were "Finally realizing having cancer," "The lonely fight that feels like prison life," "Narrower scope of life," "Lack of understanding by others," "Enduring a short, yet difficult journey," and "A turning point for a new life." Conclusion: This study provides deep insight into the experience of thyroid cancer patients who had received radioactive iodine therapy. Nurses should attend to patients' distress during radioactive iodine treatment and manage psychological difficulties as well as physical symptoms. Support from family and health care providers may help patients overcome this hard journey.

주성분 분석을 이용한 DAMADICS 공정의 이상진단 모델 개발 (Development of a Principal Component Analysis-based Fault Diagnosis Model for the DAMADICS Process)

  • 박재연;이창준
    • 한국안전학회지
    • /
    • Vol. 31 No. 4
    • /
    • pp.35-41
    • /
    • 2016
  • In order to guarantee process safety and prevent accidents, deviations from normal operating conditions should be monitored and their root causes identified as soon as possible. Among various fault diagnosis methods, those based on statistical theory have been gaining popularity due to their simplicity and speed. However, the scalar values generated by statistical methods change with fault magnitude, which can produce misleading information. To address this difficulty, this work employs a PCA (Principal Component Analysis) based method with qualitative information. In the case study of our previous study, the number of assumed faults was much smaller than that of process variables. In the case study of this work, the number of predefined faults is 19, while that of process variables is only 6. This makes fault diagnosis more difficult, since it is genuinely hard to isolate a single fault with such a small number of variables. The PCA model is constructed from normal operation data in order to obtain a loading vector, and the data sets of assumed faulty conditions are projected onto the PCA model. Significant changes along the PC (Principal Component) axes are monitored with CUSUM (Cumulative Sum Control Chart) and recorded to build the information used to identify the type of fault.
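The monitoring scheme the abstract outlines — fit PCA on normal operation data, project candidate fault data onto the loadings, and watch the PC scores with a CUSUM chart — can be sketched as follows (dimensions, reference value, and data are illustrative, not taken from the paper):

```python
import numpy as np

def pca_fit(X_normal, n_pc=2):
    """Fit PCA loadings on normal operating data (mean-centered);
    the right singular vectors are the loading vectors."""
    mu = X_normal.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_normal - mu, full_matrices=False)
    return mu, Vt[:n_pc].T          # loadings: (n_vars, n_pc)

def cusum(scores, k=0.5):
    """One-sided CUSUM on a score series: S_t = max(0, S_{t-1} + x_t - k),
    where k is the reference (allowance) value."""
    s, out = 0.0, []
    for x in scores:
        s = max(0.0, s + x - k)
        out.append(s)
    return np.array(out)

rng = np.random.default_rng(0)
X_norm = rng.normal(size=(200, 6))            # 6 process variables, normal data
mu, P = pca_fit(X_norm)
X_fault = rng.normal(size=(100, 6)) + 2.0     # assumed fault: mean shift
t_scores = (X_fault - mu) @ P[:, 0]           # scores on the first PC axis
alarm = cusum(np.abs(t_scores))               # monitor score magnitude
```

A sustained rise in the CUSUM statistic on a particular PC axis is the recorded signature used to discriminate fault types.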

Proteomics Data Analysis using Representative Database

  • Kwon, Kyung-Hoon;Park, Gun-Wook;Kim, Jin-Young;Park, Young-Mok;Yoo, Jong-Shin
    • Bioinformatics and Biosystems
    • /
    • Vol. 2 No. 2
    • /
    • pp.46-51
    • /
    • 2007
  • In proteomics research using mass spectrometry, a protein database search yields protein information from the peptide sequences that best match the tandem mass spectra. The protein sequence database has been a powerful knowledge base for this protein identification. However, as protein sequence information accumulates in the database, the database grows huge, and it becomes hard to consider every protein sequence in the search because doing so consumes too much computing time. For high-throughput analysis of the proteome, non-redundant refined databases, such as the IPI human database of the European Bioinformatics Institute, have usually been used. While a non-redundant database can deliver search results at high speed, it misses variations of the protein sequences. In this study, we have considered the proteomics data from the viewpoint of protein similarities and used a network analysis tool to build a new analysis method. This method can save computing time in the database search while keeping the sequence variation needed to catch modified peptides.
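The representative-database idea — collapsing highly similar protein sequences into one representative while retaining distinct ones — can be illustrated with a toy similarity network. Here `difflib.SequenceMatcher` stands in for a real alignment score such as BLAST identity, and the threshold and sequences are hypothetical:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Crude sequence similarity ratio in [0, 1] (a stand-in for a
    real alignment score such as BLAST percent identity)."""
    return SequenceMatcher(None, a, b).ratio()

def representatives(seqs, threshold=0.9):
    """Link sequences whose pairwise similarity exceeds the threshold
    (edges of a similarity network, merged via union-find) and keep
    one representative (the longest) per connected component."""
    n = len(seqs)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if similarity(seqs[i], seqs[j]) >= threshold:
                parent[find(j)] = find(i)   # union similar sequences
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(seqs[i])
    return [max(g, key=len) for g in groups.values()]

# two near-identical variants collapse; the unrelated sequence survives
reps = representatives(["PEPTIDEK", "PEPTIDEKR", "MKTAYIAK"], threshold=0.8)
```

Searching only the representatives reduces computing time, while the kept variant per group preserves the possibility of matching modified peptides.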

Abnormal Winter Melting of the Arctic Sea Ice Cap Observed by the Spaceborne Passive Microwave Sensors

  • Lee, Seongsuk;Yi, Yu
    • Journal of Astronomy and Space Sciences
    • /
    • Vol. 33 No. 4
    • /
    • pp.305-311
    • /
    • 2016
  • The spatial extent and variation of Arctic sea ice play an important role in Earth's climate system; they are affected by conditions in the polar atmosphere and Arctic sea temperatures. The Arctic sea ice concentration is calculated from brightness temperature data derived from the Defense Meteorological Satellite Program (DMSP) F13 Special Sensor Microwave/Imager (SSMI) and the DMSP F17 Special Sensor Microwave Imager/Sounder (SSMIS) sensors. Many previous studies point to significant reductions in sea ice and their causes. We investigated the variability of Arctic sea ice using daily sea ice concentration data from passive microwave observations to identify melting regions near the Arctic polar ice cap, and we discovered abnormal melting of the Arctic sea ice near the North Pole during both summer and winter. This phenomenon is hard to explain by surface air temperature or solar heating alone, as suggested by recent studies. We propose a hypothesis to explain it: heat from the deep sea at Arctic Ocean ridges and/or hydrothermal vents might be contributing to the melting of Arctic sea ice. This hypothesis could be verified by observing a warm water column structure below the melting or thinning Arctic sea ice through projects such as the Coriolis dataset for reanalysis (CORA).
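A drop test on successive daily concentration maps is one simple way to flag candidate melting cells of the kind the study looks for; the thresholds and toy grids below are hypothetical, not values from the paper:

```python
import numpy as np

def melting_mask(conc_prev, conc_curr, drop=0.15, ice_min=0.15):
    """Flag grid cells whose sea ice concentration (0..1) fell by more
    than `drop` between two daily maps, restricted to cells that
    previously held ice (concentration above `ice_min`)."""
    conc_prev = np.asarray(conc_prev, dtype=float)
    conc_curr = np.asarray(conc_curr, dtype=float)
    return (conc_prev > ice_min) & ((conc_prev - conc_curr) > drop)

# 2x2 toy grid: only the top-left cell shows a large concentration drop
day1 = np.array([[0.95, 0.90], [0.10, 0.80]])
day2 = np.array([[0.60, 0.88], [0.05, 0.82]])
mask = melting_mask(day1, day2)
```

Persistent flags in winter, when solar heating is absent, would be the anomalous signal the abstract describes.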

항공기에서 투하되는 수중운동체의 초기정렬기법 연구 (A Study of An Initial Alignment Method of Underwater Vehicle Dropped from Aircraft)

  • 류동기;김삼수
    • 한국군사과학기술학회지
    • /
    • Vol. 6 No. 1
    • /
    • pp.21-29
    • /
    • 2003
  • The Strapdown Inertial Measurement Unit (SDIMU) has recently been used as the sensor package of modern underwater vehicles such as torpedoes and unmanned underwater vehicles. To use an SDIMU, an initial alignment must be carried out before the firing or navigation stage. General initial alignment methods require either that the mother vehicle be stationary or that the vehicle's Inertial Navigation System (INS) receive specific navigation data from the mother vehicle. However, an underwater vehicle dropped from an aircraft can hardly satisfy either of these conditions. We therefore suggest a new strapdown initial alignment method for an underwater vehicle dropped from an aircraft, without using any aiding sensors. The key point of this method is that the initial alignment is performed not before firing but during the running stage, in order to correct the alignment error. We verify the method by analyzing various data from S/W simulations, Hardware-In-the-Loop Simulation (HILS) tests, and sea trials.
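For context, the classic stationary coarse alignment that the dropped vehicle cannot perform levels the platform from averaged accelerometer output (heading is not observable this way and needs another source). A small sketch under an assumed z-down body-frame convention — this illustrates the conventional method, not the paper's in-run scheme:

```python
import math

def coarse_level(fx, fy, fz):
    """Coarse leveling from averaged accelerometer specific force
    (body frame, m/s^2), assuming the only sensed acceleration is
    gravity. Returns (roll, pitch) in radians; heading must come
    from another source or in-run estimation."""
    roll = math.atan2(-fy, -fz)
    pitch = math.atan2(fx, math.hypot(fy, fz))
    return roll, pitch

# level, stationary sensor: f = (0, 0, -g) in a z-down body frame
roll, pitch = coarse_level(0.0, 0.0, -9.81)
# nose pitched up by 0.1 rad: gravity leaks into the x-axis reading
roll2, pitch2 = coarse_level(9.81 * math.sin(0.1), 0.0, -9.81 * math.cos(0.1))
```

The paper's contribution is precisely that it avoids needing this quiet, stationary window before launch.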

명목임금의 경직성과 고용변동성 (Nominal Wage Rigidity and Employment Volatility)

  • 황상현;이진영
    • 아태비즈니스연구
    • /
    • Vol. 10 No. 4
    • /
    • pp.137-151
    • /
    • 2019
  • Using Korean Labor and Income Panel Study data, this paper estimates nominal wage rigidity in Korea by industry from 2005 to 2017 and evaluates the level of inefficiency of the Korean labor market. Then, after estimating employment volatility by industry using the Labor Force Survey at Establishments data for Korea, we combine the nominal wage rigidity and employment volatility estimates and analyze the effect of nominal wage rigidity on employment volatility in Korea from 2011 to 2017. If the level of wage rigidity is high, it may be hard for the labor market to reach equilibrium, and the market may therefore be inefficient. We find that the inefficiency of the Korean labor market has increased from 2005 to 2017 and that the accommodation and food service industry has the highest level of inefficiency over the period. We also find that a one-percentage-point increase in wage rigidity increases employment volatility by 2.3-2.9 percent, and that this positive effect is larger for workers with part-time and temporary jobs. The result implies that firms may adjust their labor costs by changing the number of casual workers, rather than permanent workers, when the labor market suffers from a high level of wage rigidity.
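The core regression — employment volatility on wage rigidity — can be sketched with a plain OLS fit. The data below are synthetic, with the slope set near the paper's reported 2.3-2.9 range purely for illustration:

```python
import numpy as np

def ols(y, X):
    """Ordinary least squares with an intercept: returns coefficients
    [b0, b1, ...] minimizing ||y - [1 X] b||^2."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

rng = np.random.default_rng(1)
rigidity = rng.uniform(0, 10, size=200)    # synthetic industry-year rigidity (%-points)
volatility = 1.0 + 2.6 * rigidity + rng.normal(scale=0.5, size=200)
beta = ols(volatility, rigidity)           # beta[1]: effect per 1 %-point of rigidity
```

In the paper the specification additionally varies by worker type (part-time, temporary), which would correspond to interaction terms in this setup.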

자원공유 수단으로서의 전문 데이터베이스 (Full-text databases as a means for resource sharing)

  • 노진구
    • 한국도서관정보학회지
    • /
    • Vol. 24
    • /
    • pp.45-79
    • /
    • 1996
  • Rising publication costs and declining financial resources have led to renewed interest among librarians in resource sharing. Although the idea of sharing resources is not new, there is a sense of urgency not seen in the past. Driven by rising publication costs and static, often shrinking, budgets, librarians are embracing resource sharing as an idea whose time may finally have come. Resource sharing in electronic environments is shifting the concept of the library from a warehouse of print-based collections to a point of access to needed information. Much of the library's material will be delivered in electronic form or printed. In this new paradigm, libraries cannot be expected to support research from their own collections alone. These changes, along with improved communications, computerization of administrative functions, fax and digital delivery of articles, and advances in data storage technologies, are improving the procedures and means for delivering needed information to library users. In short, for resource sharing to be truly effective and efficient, automation and data communication are essential. This article examines the possibility of using full-text online databases as a supplement to interlibrary loan for document delivery. The findings of the study can be summarized as follows. First, the turnaround time and the cost of getting a hard copy of a journal article from online full-text databases were comparable to those of other document delivery services. Second, the use of full-text online databases should be considered as a method for promoting interlibrary loan services, as it is more cost-effective and labor-saving. Third, for full-text databases to work as a document delivery system, the databases must contain as many periodicals as possible and be loaded on as many systems as possible. Fourth, to include many scholarly research journals in full-text databases, we need guidelines covering electronic document delivery and electronic reserves. Fifth, to build a fully full-text database, more advanced information technologies are needed.

Evolutionary Optimization of Pulp Digester Process Using D-optimal DOE and RSM

  • Chu, Young-Hwan;Han, Chonghun
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • Proceedings of the 15th ICROS Annual Conference, 2000
    • /
    • pp.395-395
    • /
    • 2000
  • Optimization of existing processes is becoming more important than in the past, as environmental problems and concerns about energy savings stand out. When we can model a process mathematically, we can easily optimize it by using the model as constraints. However, modeling is very difficult for most chemical processes, since they include numerous units together with their correlations and the parameters can hardly be obtained. Therefore, optimization based on process models is, in turn, hard to perform. Especially for unknown processes, such as bioprocesses or microelectronics materials processes, optimization using a mathematical (first-principles) model is nearly impossible, as we cannot understand the inside mechanism. Consequently, we propose a new optimization method that uses an empirical model evolutionarily instead of a mathematical model. In this method, a design of experiments is first executed to remove unnecessary experiments. D-optimal DOE is the most developed among DOE techniques: it calculates design points so as to minimize the parameter variances of the empirical model. Experiments must be performed in order to see the causation between input variables and output variables, as only correlation structure can be detected in historical data. Then, using the data generated by the experiments, an empirical model, i.e., a response surface, is built by PLS or MLR. Once the process model is constructed, it is used as the objective function for optimization. Since the obtained optimum point is a local one, the above procedures are repeated while moving to a new experimental region in order to find the global optimum. As a result of application to the pulp digester benchmark model, the kappa number, an indicator of impurity content, decreased from 29.7091 to a very low value of 3.0394. From this result, we can see that the proposed methodology has sufficiently good performance for optimization and is also applicable to real processes.
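The evolutionary loop the abstract describes — sample a local design, fit a response surface, move to its optimum, repeat in a new region — can be sketched in one dimension. A toy objective and a plain quadratic least-squares fit stand in for the pulp digester model and the D-optimal/PLS machinery:

```python
import numpy as np

def fit_quadratic(x, y):
    """Least-squares fit of y ~ b0 + b1*x + b2*x^2 (a 1-D response surface)."""
    A = np.column_stack([np.ones_like(x), x, x**2])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    return b

def rsm_minimize(f, center, width, n_iter=10, n_pts=7):
    """Evolutionary RSM: sample a local design, fit a quadratic,
    jump to its stationary point (clipped to the region), shrink
    the region, and repeat."""
    for _ in range(n_iter):
        x = np.linspace(center - width, center + width, n_pts)
        b = fit_quadratic(x, f(x))
        if b[2] > 1e-12:                          # convex fit: use vertex -b1/(2*b2)
            center = float(np.clip(-b[1] / (2 * b[2]),
                                   center - width, center + width))
        else:                                     # otherwise move to the best sample
            center = float(x[np.argmin(f(x))])
        width *= 0.5                              # focus the next experimental region
    return center

# toy "process" whose minimum response sits at x = 3
x_opt = rsm_minimize(lambda x: (x - 3.0)**2 + 1.0, center=0.0, width=2.0)
```

Clipping the vertex to the current region mirrors the idea of moving stepwise toward the global optimum rather than trusting the local surface far from its data.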

High-revenue Online Provisioning for Virtual Clusters in Multi-tenant Cloud Data Center Network

  • Lu, Shuaibing;Fang, Zhiyi;Wu, Jie
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 13 No. 3
    • /
    • pp.1164-1183
    • /
    • 2019
  • The rapid development of cloud computing and the high requirements of operators require strong support from the underlying data center networks. Therefore, the effectiveness of resource use in data center networks has become a point of concern for operators and a subject of research. In this paper, we discuss the online virtual-cluster provisioning problem for multiple tenants, with the aim of deciding when and where a virtual cluster should be placed in a data center network. Our objective is to maximize the total revenue of the data center network under the given constraints. To solve this problem, we divide it into two parts: online multi-tenancy scheduling and virtual cluster placement. The first part determines the scheduling order for the multiple tenants, and the second part determines the locations of the virtual machines. We first approach the problem using a variational inequality model and discuss the existence of the optimal solution. We then prove that provisioning virtual clusters for a multi-tenant data center network so as to maximize revenue is NP-hard. Due to the complexity of this problem, an efficient heuristic algorithm, OMS (Online Multi-tenancy Scheduling), is proposed to solve the online multi-tenancy scheduling problem. We further explore the virtual cluster placement problem based on OMS and propose a novel algorithm for virtual machine placement. We evaluate our algorithms through a series of simulations, and the results demonstrate that OMS can significantly increase the efficiency and total revenue of the data centers.
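A revenue-greedy admission rule gives the flavor of online multi-tenancy scheduling, though the paper's OMS algorithm is more elaborate; the tenant names and numbers below are made up for illustration:

```python
def greedy_schedule(requests, capacity):
    """Greedy scheduling sketch: order tenant requests by revenue per
    unit of demanded resource and admit each request while capacity
    remains. requests: list of (name, demand, revenue) tuples."""
    admitted, total = [], 0.0
    for name, demand, revenue in sorted(
            requests, key=lambda r: r[2] / r[1], reverse=True):
        if demand <= capacity:       # admit only if the cluster still fits
            capacity -= demand
            admitted.append(name)
            total += revenue
    return admitted, total

# tenant B has the best revenue density, so it is admitted first;
# C no longer fits after A and B are placed
admitted, revenue = greedy_schedule(
    [("A", 4, 8.0), ("B", 3, 9.0), ("C", 5, 5.0)], capacity=8)
```

In the full problem, each admitted cluster must additionally be mapped onto machines and links of the data center topology, which is the placement half of the paper.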