
A Goodness of Fit Approach to Major Lifetesting Problems

  • Ahmad, Ibrahim A.; Alwasel, Ibrahim A.; Mugdadi, A.R.
    • International Journal of Reliability and Applications, v.2 no.2, pp.81-97, 2001
  • Lifetesting problems have been the subject of investigation for over three decades. Most suggested approaches are markedly different from those used in the related but wider goodness-of-fit problems. In the current investigation, it is demonstrated that a goodness-of-fit approach is possible in many lifetesting problems and that it results in simpler procedures that are asymptotically equivalent to, or better than, standard ones. They may also have superior finite-sample behavior. Several perennial classes are addressed here. The class of increasing failure rate (IFR) and the class of new better than used (NBU) distributions are addressed first. In addition, we provide tests for the newer and practical class of new better than used in convex ordering (NBUC) distributions due to Cao and Wang (1991). Other classes can be treated similarly, a point illustrated with the classes of new better than used in expectation (NBUE) and harmonic new better than used in expectation (HNBUE).
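For reference, the aging classes named in this abstract have standard definitions in reliability theory, stated in terms of the survival function of a lifetime X with finite mean; the following summary is drawn from the standard literature, not from the paper itself:

```latex
% Standard aging-class definitions for a lifetime X with survival
% function \bar{F}(t) = P(X > t), density f, and finite mean \mu.
\begin{align*}
\text{IFR:}   &\quad r(t) = f(t)/\bar{F}(t) \text{ is nondecreasing in } t \ge 0,\\
\text{NBU:}   &\quad \bar{F}(x+y) \le \bar{F}(x)\,\bar{F}(y) \quad \text{for all } x, y \ge 0,\\
\text{NBUC:}  &\quad \int_x^{\infty} \bar{F}(t+u)\,du \le \bar{F}(t)\int_x^{\infty} \bar{F}(u)\,du \quad \text{for all } x, t \ge 0,\\
\text{NBUE:}  &\quad \int_t^{\infty} \bar{F}(u)\,du \le \mu\,\bar{F}(t) \quad \text{for all } t \ge 0,\\
\text{HNBUE:} &\quad \int_t^{\infty} \bar{F}(u)\,du \le \mu\,e^{-t/\mu} \quad \text{for all } t \ge 0.
\end{align*}
```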


Software Climate Change and its Disruptive Weather: A Potential Shift from "Software Engineering" to Vibrant/Dynamic Softology

  • Ghani, Imran; Jeong, Seung Ryul
    • KSII Transactions on Internet and Information Systems (TIIS), v.10 no.8, pp.3925-3942, 2016
  • Like the natural climate of planet Earth, the climate in software development environments is changing, and changing fast. Like natural weather, the software environment is also disruptive. Just as climate experts issue alerts and suggest measures to make the Earth a safer and more comfortable place to live, this article alerts the stakeholders of software craftsmanship to dynamic challenges that traditional Software Engineering (SE), with its purely "engineering" mind-set, is not capable of responding to. Some new thoughts for overcoming these challenges are then shared. Fundamentally, based on historical evidence, the article presents the authors' observation of a continuous shift from traditional "engineering-based" software development approaches to disruptive approaches, which they call "Vibrant Softology". The authors see the cause of this shift as a disruptive transformational force so powerful that it is uncontrollably diminishing the engineering-based approach in software development environments, and they align it with the climate-change analogy. Based on this analogy, the authors argue for theoretically re-coining the notion of SE as a new term, perhaps Vibrant/Dynamic Softology (VS or DS), and suggest that "a new (disruptive and dynamic) way of thinking is required to develop software". It is worth mentioning that the purpose of the article and this new theory is not to disparage the notion of software engineering altogether; rather, the aim is to highlight the importance of a transformation from SE to its next level (perhaps VS/DS) in response to emerging needs in the software craftsmanship environment.

Spectro-Temporal Filtering Based on Soft Decision for Stereophonic Acoustic Echo Suppression (A Soft Decision-Based Filter Extension Technique for Stereophonic Acoustic Echo Suppression)

  • Lee, Chul Min; Bae, Soo Hyun; Kim, Jeung Hun; Kim, Nam Soo
    • The Journal of Korean Institute of Communications and Information Sciences, v.39C no.12, pp.1346-1351, 2014
  • We propose a novel approach to stereophonic acoustic echo suppression using spectro-temporal filtering based on soft decision. Unlike conventional approaches, which estimate the echo paths directly, the proposed technique can estimate the stereo echo spectra without any double-talk detector. To improve the estimation of the echo spectra, an extended power spectral density matrix and an echo overestimation control matrix are applied in this method. In addition, the echo suppression is based on a soft-decision technique using the speech absence probability in the STFT domain. Experimental results show that the proposed method outperforms conventional approaches.
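To make the idea concrete, here is a minimal sketch of a soft-decision spectral suppression gain; the Wiener-style gain rule, variable names, and parameter values are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

def suppression_gain(mic_psd, echo_psd, sap, g_min=0.1):
    """Per-bin spectral suppression gain with a soft decision.

    mic_psd  : estimated PSD of the microphone signal, per STFT bin
    echo_psd : estimated PSD of the echo, per STFT bin
    sap      : speech absence probability in [0, 1], per STFT bin
    g_min    : spectral floor limiting musical noise (assumed value)
    """
    # Wiener-like gain: keep the fraction of power not explained by echo.
    near_end_ratio = np.maximum(mic_psd - echo_psd, 0.0) / np.maximum(mic_psd, 1e-12)
    gain = np.maximum(near_end_ratio, g_min)
    # Soft decision: suppress harder where near-end speech is likely absent.
    return sap * g_min + (1.0 - sap) * gain

# Toy usage on random spectra (illustrative only).
rng = np.random.default_rng(0)
g = suppression_gain(rng.uniform(0.5, 2.0, 257),
                     rng.uniform(0.0, 1.0, 257),
                     rng.uniform(0.0, 1.0, 257))
```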

The Convergence of Habermas' Communicative Action Theory and Public Relations (Habermas' Communicative Rationality and the Expansion of the Meaning of PR Communication)

  • Kim, Yung-Wook
    • Korean Journal of Communication and Information, v.30, pp.89-119, 2005
  • The purpose of this essay is to converge the theory of communicative action into the new paradigm of 'public relations democracy.' The notions of communicative rationality, the public sphere, and deliberative democracy have led to new public relations paradigm approaches, including meaning sharing, enlargement of media access, and theoretical ramifications for the powerless. Just as Habermas envisioned the power of comprehensive rationality to solve post-capitalist problems, the paradigm of public relations democracy envisions a new era of public relations equipped with rhetorical and critical approaches. The new paradigm tries to overcome the functional fallacy and embraces the concept of public interest. The paradigm of public relations democracy aims at integrating all three levels of public relations activity, the individual, organizational, and social levels, and seeks to enlarge the public sphere by increasing communicative action and resolving social conflicts. Habermas's critical theory thus offers an opportunity for public relations theory building.


Ground-Motion Prediction Equations based on refined data for dynamic time-history analysis

  • Moghaddam, Salar Arian; Ghafory-Ashtiany, Mohsen; Soghrat, Mohammadreza
    • Earthquakes and Structures, v.11 no.5, pp.779-807, 2016
  • Ground Motion Prediction Equations (GMPEs) are essential tools in seismic hazard analysis. With the introduction of probabilistic approaches for estimating the seismic response of structures, known as the performance-based earthquake engineering framework, new tasks have been defined for the response spectrum, such as serving as the reference criterion for effective, structure-specific selection of ground motions for nonlinear time-history analysis. One recent effort to introduce a high-quality databank of ground motions, together with a corresponding selection scheme based on broadband spectral consistency, is SIMBAD (Selected Input Motions for displacement-Based Assessment and Design), which is designed to improve the reliability of spectral values at all natural periods by removing noise with modern approaches. In this paper, a new global GMPE is proposed using ground motions selected from SIMBAD, in order to improve the reliability of computed spectral shape indicators. To determine the regression coefficients, 204 pairs of horizontal components from 35 earthquakes with magnitudes ranging from Mw 5 to Mw 7.1 and epicentral distances below 40 km are used. The proposed equation is compared with similar models both qualitatively and quantitatively. After the model is verified by several goodness-of-fit measures, the epsilon (ε) values serving as the spectral shape indicator are computed, and the validity of available prediction equations for the correlation between pairs of epsilon values is examined. General consistency between predictions by the new model and others is confirmed, especially at short periods, while at longer periods there are meaningful differences between the normalized residuals and correlation coefficients estimated by the new model and those computed by other empirical equations. A simple collapse assessment example indicates a possible improvement of up to 20% in the correlation between collapse capacity and the spectral shape indicator ε when a more applicable GMPE is selected for calculating ε.
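For readers unfamiliar with the spectral shape indicator, ε is conventionally defined relative to a GMPE's prediction; this is the standard definition from the ground-motion literature, not a formula quoted from the paper:

```latex
% Epsilon: the number of logarithmic standard deviations by which an
% observed spectral acceleration exceeds the GMPE's median prediction.
\varepsilon(T) = \frac{\ln S_a(T) - \mu_{\ln S_a}(M, R, T)}{\sigma_{\ln S_a}(T)}
```

Here S_a(T) is the observed spectral acceleration at period T, and μ and σ are the mean and standard deviation of ln S_a predicted by the GMPE for magnitude M and distance R.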

A Study on the New RFP for the PMO-based Development of Next-Generation MIS (A Study on Writing a New RFP for PMO-Based Next-Generation MIS Development)

  • Han, Moonhee; Kim, Yousin; Kim, Dae Ho
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology, v.6 no.11, pp.585-594, 2016
  • Information system development projects are growing in size and becoming more complex in their development environments. Traditional approaches to information system development, however, have not shown the expected performance in managing them, and many researchers have been trying to find a new approach to overcome these challenges. One such new approach is the PMO (Project Management Office), introduced to solve the problems of project management along with the related organizational, technical, and administrative problems. This study attempts to find an efficient plan for establishing information systems. In particular, it investigates the practicality of the PMO and detailed plans for its utilization. As a result, this study proposes the structure of project implementation, the requirements, the direction of the basic system design, the scope of system development, and the budget for the system. It is expected to contribute to the successful implementation of such projects.

Comparison of Deterministic and Probabilistic Approaches through Cases of Exposure Assessment of Child Products (An Analysis and Review of the Use of Deterministic and Probabilistic Methodologies in Exposure Assessment Studies of Children's Products)

  • Jang, Bo Youn; Jeong, Da-In; Lee, Hunjoo
    • Journal of Environmental Health Sciences, v.43 no.3, pp.223-232, 2017
  • Objectives: In response to increased interest in the safety of children's products, a risk management system is being prepared through exposure assessment of hazardous chemicals. To estimate exposure levels, risk assessors use deterministic and probabilistic statistical approaches, along with commercial Monte Carlo simulation-based tools (MCTools) that efficiently support calculation of the probability density functions. This study was conducted to analyze and discuss the usage patterns of, and problems associated with, these two approaches and the MCTools used in probabilistic assessments, by reviewing research reports related to exposure assessment for children's products. Methods: We collected six research reports on exposure and risk assessment of children's products and summarized the deterministic results and the corresponding underlying distributions for the exposure dose and concentration estimates produced by the deterministic and probabilistic approaches. We focused on the mechanisms of, and differences between, the MCTools used for fitting probabilistic distributions, to validate simulation adequacy in detail. Results: The exposure dose and concentration estimates from the deterministic approaches were 0.19-3.98 times those from the probabilistic approaches. For the probabilistic approach, the lognormal, Student's t, and Weibull distributions were the most frequently used underlying distributions for the input parameters. However, we could not examine the reasons for the selection of each distribution because of the absence of test statistics. In addition, in some cases a discrete probability distribution model was fitted as the underlying distribution for continuous variables such as body weight. To find the cause of these abnormal simulations, we applied the two MCTools used across all the reports and traced the improper usage routes of the MCTools. Conclusions: For transparent and realistic exposure assessment, it is necessary to 1) establish standardized guidelines for the proper use of the two statistical approaches, including notes for each MCTool, and 2) consider the development of a new software tool with proper configurations and features specialized for risk assessment. Such guidelines and software will make exposure assessment more user-friendly, consistent, and rapid in the future.
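As background, the contrast between the two approaches can be sketched with a generic average-daily-dose calculation; the dose equation, distribution parameters, and variable names below are illustrative assumptions, not values from the reviewed reports:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo iterations

# Illustrative lognormal input distributions (all parameters assumed).
conc = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=N)          # mg/kg in product
migration = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=N)    # fraction per day
body_weight = rng.lognormal(mean=np.log(16.0), sigma=0.2, size=N)  # kg (child)

# Generic average daily dose, mg/kg-bw/day.
add = conc * migration / body_weight

# Deterministic point estimate using the same central inputs, for contrast.
add_det = 5.0 * 0.01 / 16.0

print(f"deterministic estimate : {add_det:.5f}")
print(f"probabilistic median   : {np.median(add):.5f}")
print(f"95th percentile        : {np.percentile(add, 95):.5f}")
```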

Introducing SEABOT: Methodological Quests in Southeast Asian Studies

  • Keck, Stephen
    • SUVANNABHUMI, v.10 no.2, pp.181-213, 2018
  • How should Southeast Asia (SEA) be studied? The need to explore and identify methodologies for studying SEA is inherent in its multifaceted subject matter. At a minimum, the region's rich cultural diversity inhibits both the articulation of decisive defining characteristics and the training of scholars who can write with confidence beyond their specialisms. Consequently, the challenges of understanding the region remain, and a consensus regarding the most effective approaches to studying its history, identity, and future seems quite unlikely. Furthermore, "Area Studies" more generally has proved to be a less attractive frame of reference for burgeoning scholarly trends. This paper proposes a new tool to help address these challenges. Even though the science of artificial intelligence (AI) is in its infancy, it has already yielded new approaches to many commercial, scientific, and humanistic questions. At this point, AI has been used to produce news, build better smartphones, deliver more entertainment choices, analyze earthquakes, and write fiction. The time has come to explore the possibility that AI can be put at the service of the study of SEA. The paper lays out what would be required to develop SEABOT, an instrument that might exist as a robot on the web and be called upon to make the study of SEA both broader and more comprehensive. The discussion explores the financial resources, ownership, and timeline needed to take SEABOT from an idea to a reality. SEABOT would draw upon artificial neural networks (ANNs) to mine the region's "Big Data" while synthesizing the information to form new and useful perspectives on SEA. Overcoming significant language issues, applying multidisciplinary methods, and drawing upon new yields of information should produce new questions and new ways to conceptualize SEA. SEABOT could lead to findings that might not otherwise be achieved. SEABOT's work might well produce outcomes that could open up solutions to immediate regional problems, provide ASEAN planners with new resources, and make it possible to eventually define and capitalize on SEA's "soft power". That is, new findings should provide the basis for ASEAN diplomats and policy-makers to develop new modalities of cultural diplomacy and improved governance. Last, SEABOT might also open up avenues for telling the SEA story in new and distinctive ways. SEABOT is treated here as a heuristic device for exploring the results such an instrument might yield. More importantly, the discussion also raises the possibility that an AI-driven perspective on SEA may prove to be more problematic than beneficial.


On-line Trace Based Automatic Parallelization of Java Programs on Multicore Platforms

  • Sun, Yu; Zhang, Wei
    • Journal of Computing Science and Engineering, v.6 no.2, pp.105-118, 2012
  • We propose two new approaches that automatically parallelize Java programs at runtime. These approaches, which rely on run-time trace information collected during program execution, dynamically recompile Java bytecode so that it can be executed in parallel. One approach utilizes trace information to improve traditional loop parallelization, and the other parallelizes traces instead of loop iterations. We also describe a cost/benefit model that makes intelligent parallelization decisions, as well as a parallel execution environment for running the parallelized programs. These techniques are built on Jikes RVM. Our approach is evaluated by parallelizing sequential Java programs, and its performance is compared to that of manually parallelized code. According to the experimental results, our approach has low overhead and achieves speedups competitive with manually parallelized code. Moreover, trace parallelization can exploit parallelism beyond loop iterations.
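The flavor of such a cost/benefit decision can be sketched as follows; the cost model, overhead constants, and function names are illustrative assumptions, not the model described in the paper:

```python
def should_parallelize(seq_time, iterations, num_threads,
                       spawn_overhead=1e-4, sync_overhead=5e-5):
    """Decide whether a region (loop or trace) is worth parallelizing.

    seq_time       : measured sequential time of the region, seconds
    iterations     : loop iterations or trace repetitions observed
    num_threads    : available worker threads
    spawn_overhead : assumed per-thread startup cost, seconds
    sync_overhead  : assumed per-iteration synchronization cost, seconds
    """
    est_parallel = (seq_time / num_threads
                    + spawn_overhead * num_threads
                    + sync_overhead * iterations)
    # Parallelize only when the modeled benefit outweighs the overheads.
    return est_parallel < seq_time, est_parallel

decision, est = should_parallelize(seq_time=0.8, iterations=10_000, num_threads=4)
print(f"parallelize: {decision}, estimated parallel time: {est:.3f}s")
```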

Adaptive Channel Normalization Based on Infomax Algorithm for Robust Speech Recognition

  • Jung, Ho-Young
    • ETRI Journal, v.29 no.3, pp.300-304, 2007
  • This paper proposes a new data-driven high-pass approach that suppresses slowly varying noise components. Conventional high-pass approaches are based on the idea of decorrelating the feature vector sequence and strive for adaptability to various conditions. The proposed method is based on temporal local decorrelation using information-maximization theory for each utterance. It operates on an utterance-by-utterance basis, which provides an adaptive channel normalization filter for each condition. The performance of the proposed method is evaluated through isolated-word recognition experiments with channel distortion. Experimental results show that the proposed method yields an outstanding improvement in channel-distorted speech recognition.
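For context, the simplest conventional normalization in this family is per-utterance cepstral mean subtraction, which removes a stationary convolutive channel as a constant offset in the cepstral domain; the sketch below shows that standard baseline, not the paper's infomax filter:

```python
import numpy as np

def cepstral_mean_normalize(features):
    """Per-utterance cepstral mean subtraction (CMS).

    features : (num_frames, num_coeffs) array of cepstral features.
    A fixed channel multiplies the spectrum, so it adds a constant in
    the cepstral domain; subtracting the utterance mean removes it.
    """
    return features - features.mean(axis=0, keepdims=True)

# Toy check: a constant per-utterance channel offset is removed exactly.
rng = np.random.default_rng(1)
clean = rng.normal(size=(200, 13))
channel = rng.normal(size=(1, 13))   # stationary channel offset
restored = cepstral_mean_normalize(clean + channel)
print(np.allclose(restored, cepstral_mean_normalize(clean)))  # True
```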
