• Title/Summary/Keyword: memory organizations


The role of positive affect in virtual collaboration: a transactive memory system perspective

  • Chae, Seong Wook
    • Journal of the Korea Society of Computer and Information / v.21 no.5 / pp.99-109 / 2016
  • Creative performance has been regarded as the key to the success of an organization in recent years and is considered essential for its survival. Organizations must find and develop creative solutions to deal with a variety of business issues. How can organizations become more creative? To develop creativity, an organization must make it easier to connect the knowledge and perspectives of its various members, who may be scattered around the world, by developing a virtual team. Drawing on transactive memory systems (TMS), which comprise expertise location, credibility, and coordination, this study investigates how the positive affect of team members influences the development of creative performance during virtual collaboration, where face-to-face team activities are limited. The proposed structural model was empirically tested with cross-sectional data from 322 individuals. Results indicated that the positive affect of team members moderates the relationship between TMS and creativity. Through this study, we expect to provide an understanding of the mechanisms involved in developing creativity among team members in a virtual work environment.

Fuel Consumption Prediction and Life Cycle History Management System Using Historical Data of Agricultural Machinery

  • Jung Seung Lee;Soo Kyung Kim
    • Journal of Information Technology Applications and Management / v.29 no.5 / pp.27-37 / 2022
  • This study intends to link agricultural machinery history data with related organizations or collect it through IoT sensors, receive input from agricultural machinery users and managers, and analyze it with AI algorithms. Through this, the goal is to track and manage history data throughout all stages of the production, purchase, operation, and disposal of agricultural machinery. First, a deep learning algorithm is built in which LSTM (Long Short-Term Memory) estimates fuel consumption and recommends maintenance from the historical data of agricultural machines such as tractors and combines, and C-LSTM (Convolutional Long Short-Term Memory) diagnoses and determines failures. Second, to collect historical data on agricultural machinery, IoT sensors including a GPS module, gyro sensor, acceleration sensor, and temperature and humidity sensor are attached to the machinery to collect data automatically. Third, event-type data such as agricultural machinery production, purchase, and disposal are collected automatically from related organizations, and an interface is designed to integrate the entire life-cycle history data.
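The LSTM this abstract describes maintains a long-term cell state and a short-term hidden state, updated one time step at a time. A minimal sketch of a single LSTM cell step over a toy fuel-consumption series follows; the weights here are random illustrative values, not the paper's trained model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM cell step; gate weights are stacked as [input, forget, cell, output]."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:n])          # input gate
    f = sigmoid(z[n:2*n])        # forget gate
    g = np.tanh(z[2*n:3*n])      # candidate cell state
    o = sigmoid(z[3*n:4*n])      # output gate
    c = f * c_prev + i * g       # new cell state (long-term memory)
    h = o * np.tanh(c)           # new hidden state (short-term memory)
    return h, c

# Toy example: run the cell over a short fuel-consumption series.
rng = np.random.default_rng(0)
hidden, features = 8, 1
W = rng.normal(0, 0.1, (4 * hidden, features))
U = rng.normal(0, 0.1, (4 * hidden, hidden))
b = np.zeros(4 * hidden)

series = np.array([5.2, 5.8, 6.1, 5.9, 6.4])   # hypothetical litres/hour readings
h = np.zeros(hidden)
c = np.zeros(hidden)
for value in series:
    h, c = lstm_step(np.array([value]), h, c, W, U, b)
# h now summarizes the sequence and would feed a dense layer predicting the next reading.
```

In practice a framework such as Keras would supply the trained weights; the point here is only the gated update that lets the cell retain long-range history.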

Analysis on the GPU Performance according to Hierarchical Memory Organization (계층적 메모리 구성에 따른 GPU 성능 분석)

  • Choi, Hongjun;Kim, Jongmyon;Kim, Cheolhong
    • The Journal of the Korea Contents Association / v.14 no.3 / pp.22-32 / 2014
  • Recently, GPGPU has been widely used for general-purpose processing as well as graphics processing by providing hardware optimized for parallel processing. The memory system has a large effect on the performance of parallel processing units such as the GPU. In the GPU, a hierarchical memory architecture is implemented to provide high memory bandwidth, and both memory address coalescing and memory request merging techniques are widely used. This paper analyzes GPU performance under various memory organizations. According to our simulation results, GPU performance improves by 15.5%, 21.5%, 25.5%, and 30.9% when adding an 8KB, 16KB, 32KB, or 64KB L1 cache, respectively, compared to the case without an L1 cache. However, the experiments also show that some benchmarks lose performance because memory transactions increase due to data dependency. Moreover, when cache misses occur frequently, average memory access latency increases with the depth of the cache hierarchy.
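The memory address coalescing the abstract mentions groups the per-thread addresses of a warp into as few aligned memory transactions as possible. A minimal sketch, assuming a 128-byte transaction segment (typical of NVIDIA GPUs, not taken from this paper):

```python
def coalesce(addresses, segment=128):
    """Group per-thread byte addresses from one warp into the set of
    aligned memory segments (transactions) they touch."""
    return sorted({addr // segment for addr in addresses})

# Fully coalesced: 32 threads reading consecutive 4-byte words -> 1 transaction.
warp = [4 * t for t in range(32)]
assert len(coalesce(warp)) == 1

# Strided access: threads 128 bytes apart -> one transaction per thread.
strided = [128 * t for t in range(32)]
assert len(coalesce(strided)) == 32
```

This is why access pattern, not just cache size, decides how many transactions reach memory: the same 32 loads cost 1 or 32 transactions depending on layout.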

Meaning of Memory in Archival Activism (기억의 기록학적 의미와 실천)

  • Seol, Moon-won
    • The Korean Journal of Archival Studies / no.67 / pp.267-318 / 2021
  • The purpose of this study is to analyze how the "memory approach" has affected archival methodology and activities, and to suggest directions for archival activities in each field. Although there have been many discussions of memory and collective memory in archival studies, it is necessary to analyze them more practically from the viewpoint of archival activism. In this study, the memory approaches in archival discourse are classified into four categories in terms of archival activism: i) the role of archives as social memory organizations; ii) the memory struggle to uncover the truth of the past; iii) archival activities of restorative justice for people who suffer from traumatic memories after social disasters and human rights violations; and iv) the memory processes of community archiving for strengthening community identities. The meaning and issues of each category are analyzed, and practice based on archival expertise together with political and social practice are examined as necessary competencies for archival activism.

The Effects of Cache Memory on the System Bus Traffic (캐쉬 메모리가 버스 트래픽에 끼치는 영향)

  • 조용훈;김정선
    • The Journal of Korean Institute of Communications and Information Sciences / v.21 no.1 / pp.224-240 / 1996
  • It is common for at least one level of cache memory to be used in today's computer systems. In this paper, the impact of the internal cache memory organization on computer performance is investigated using a simulator program, written by the authors and run on a SUN SPARC workstation, with several real execution trace files. 280 cache organizations were simulated using n-way set-associative mapping and the LRU (Least Recently Used) replacement algorithm with a write-allocate policy. As a result, the 16-way set-associative cache is the best configuration; with a 256KB cache and a 64-byte line size, the bus traffic ratio decreased relative to the non-cached system, so that a single bus could support almost 7 processors without delay or degradation, at a high hit ratio (99.21%). The smaller the line size, the slightly lower the hit ratio, but the more processors a single bus can support (a maximum of 18 processors). Therefore, a proper cache memory organization can enable a single-bus structure to support multiple processors without performance degradation.
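The kind of trace-driven simulation described here needs only a tag store per set plus hit/miss counters. A minimal sketch of an n-way set-associative cache with LRU replacement (parameters below are illustrative, not the paper's 256KB configuration):

```python
from collections import OrderedDict

class SetAssociativeCache:
    """n-way set-associative cache with LRU replacement (tag store only)."""
    def __init__(self, size_bytes, line_size, ways):
        self.line_size = line_size
        self.ways = ways
        self.num_sets = size_bytes // (line_size * ways)
        self.sets = [OrderedDict() for _ in range(self.num_sets)]
        self.hits = self.misses = 0

    def access(self, address):
        block = address // self.line_size
        index = block % self.num_sets
        tag = block // self.num_sets
        s = self.sets[index]
        if tag in s:
            self.hits += 1
            s.move_to_end(tag)           # refresh LRU order
        else:
            self.misses += 1
            if len(s) >= self.ways:
                s.popitem(last=False)    # evict the least recently used line
            s[tag] = None                # install the new line

    def hit_ratio(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0

# Two sequential passes over a 1 KB array, reading a word every 4 bytes.
cache = SetAssociativeCache(size_bytes=1024, line_size=64, ways=16)
for _ in range(2):
    for addr in range(0, 1024, 4):
        cache.access(addr)
# First pass misses once per 64-byte line (16 misses); everything else hits.
```

Feeding such a simulator real execution traces and sweeping size, line size, and associativity reproduces the style of design-space exploration the paper reports.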


Archival Description and Records from Historically Marginalized Cultures: A View from a Postmodern Window

  • Sinn, Dong-Hee
    • Journal of the Korean Society for Library and Information Science / v.44 no.4 / pp.115-130 / 2010
  • In the archival field, the last decade has witnessed much discussion on archives' broad responsibilities for social memory. Considering that the social role of archives has stemmed from postmodern thinking suggests a paradigm shift from viewing archives as static recorded objects to viewing them as dynamic evidence of human memory. The modern archives and archivists are products of nineteenth-century positivism, limiting their function to archiving written documents within stable organizations. The new thoughts on the social role of archives provide a chance to realize that traditional archival practices have preserved only a sliver of organizational memory, thus ignoring fluid records of human activities and memory. Archival description is the primary method for users to access materials in archives. Thus, it can determine how archival materials will be used (or not used). The traditional archival description works as the representation of archival materials and is directly projected from the hierarchy of organizational documents. This paper argues that archivists will need to redefine archival description to be more sensitive to atypical types of archival materials from various cultural contexts. This paper surveys the postmodern approaches to archival concepts in relation to descriptive practices. It also examines some issues related to representing historically marginalized groups in archival description who were previously neglected in traditional archival practices.

Time Series Crime Prediction Using a Federated Machine Learning Model

  • Salam, Mustafa Abdul;Taha, Sanaa;Ramadan, Mohamed
    • International Journal of Computer Science & Network Security / v.22 no.4 / pp.119-130 / 2022
  • Crime is a common social problem that affects the quality of life. As the number of crimes increases, it is necessary to build a model that predicts the number of crimes that may occur in a given period, identifies the characteristics of a person who may commit a particular crime, and identifies places where a particular crime may occur. Data privacy is the main challenge that organizations face when building this type of predictive model. Federated learning (FL) is a promising approach that overcomes data security and privacy challenges, as it enables organizations to build a machine learning model on distributed datasets without sharing raw data or violating data privacy. In this paper, a federated long short-term memory (LSTM) model is proposed and compared with a traditional LSTM model. The proposed model is developed using TensorFlow Federated (TFF) and the Keras API to predict the number of crimes, and is applied to the Boston crime dataset. Its parameters are fine-tuned to obtain minimum loss and maximum accuracy. Compared with the traditional LSTM model, the federated LSTM model achieved lower loss and better accuracy, at the cost of higher training time.
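At the core of such a federated setup is federated averaging: each client trains locally on data it never shares, and a server averages the resulting weights in proportion to client dataset sizes. A simplified sketch with a linear model standing in for the paper's LSTM (the "districts" and all numbers are hypothetical, and this does not use the TFF API):

```python
import numpy as np

def local_update(weights, X, y, lr=0.05, epochs=5):
    """One client's local training: plain gradient steps on squared error.
    (Stands in for the per-client LSTM training in the paper.)"""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """FedAvg: weight each client's model by its local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
# Three hypothetical districts, each holding private data the server never sees.
clients = []
for n in (40, 60, 100):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(0, 0.05, n)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(30):                      # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])
# global_w approaches true_w without any raw data leaving a client.
```

The extra communication rounds are also why the paper observes higher training time for the federated model than for centralized training.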

A Case Study on the Documentation in a Disaster Area - On the Basis of Great Hanshin-Awaji Earthquake - (재해 지역의 기록화 사례 연구 - 한신·아와지 대지진 기록관리 사례를 중심으로 -)

  • Lee, Mi-young
    • The Korean Journal of Archival Studies / no.21 / pp.85-116 / 2009
  • Records are a store of social memory, including the collective memory of a region, and it is impossible to capture and represent all aspects of society with public records alone. Japan showed the possibility of cooperative collecting and of active records production and collection across organizations, even though this was not accomplished through a systematic documentation strategy. Several characteristics emerge from reviewing this case. First is cooperation and sharing in collecting: the activities of private organizations in particular surpassed those of public organizations, and the cooperative collecting effort transmits far more social memory and historical records to the next generation. Second is active records production and collection: private organizations also left many records of their varied activities, recognizing that recording an experience and leaving it behind is the survivors' responsibility. One cannot help but acknowledge the growth of a sense of duty and of the historical consciousness to record one's own experience after undergoing a disaster as large as the earthquake. Third, there was no limit to the collecting scope: all records related to the people and places of the disaster area were targets for collecting, in the spirit of slogans such as 'Let's transmit as many records as possible to the next generation' and 'Ordinary records and resources deserve to be kept, because they are important life information proving the circumstances of the time.' This case confirms a strong will and enthusiasm concerning how, what, and why we transmit something of a society to the next generation.

A study on Influencing Factors of Knowledge Creation focus on Transactive Memory (지식성장의 영향요인에 관한 연구-분산기억중심으로)

  • Liu, Chang;Kim, Sang Wook
    • Journal of the Korea Academia-Industrial cooperation Society / v.16 no.2 / pp.1073-1083 / 2015
  • As an empirical study of how the transactive memory concept influences the four processes of knowledge growth, this study aimed to explain the function of transactive memory in the dynamic process of knowledge growth. To verify the research model, this study carried out a path analysis of 130 team members; as a result, part of the team transactive memory measurements does affect knowledge growth. Consequently, transactive memory, especially at the team level, is relevant to knowledge growth. Targeting team members at the team level and adopting the transactive memory concept from cognitive psychology, this study theoretically explained and analyzed how to approach personal knowledge inside organizations. To accelerate the knowledge process, work should be modified collaboratively by trusting team members' duty relations more than specialized knowledge. Furthermore, managers should assign team members work in which they can make the most of their personal knowledge; this study shows that overall team performance can be improved by doing so.

Understanding the Mismatch between ERP and Organizational Information Needs and Its Responses: A Study based on Organizational Memory Theory (조직의 정보 니즈와 ERP 기능과의 불일치 및 그 대응책에 대한 이해: 조직 메모리 이론을 바탕으로)

  • Jeong, Seung-Ryul;Bae, Uk-Ho
    • Asia pacific journal of information systems / v.22 no.2 / pp.21-38 / 2012
  • Until recently, successful implementation of ERP systems has been a popular topic among ERP researchers, who have attempted to identify its various contributing factors. None of these efforts, however, explicitly recognize the need to identify disparities that can exist between organizational information requirements and ERP systems. Since ERP systems are in fact "packages" -that is, software programs developed by independent software vendors for sale to organizations that use them-they are designed to meet the general needs of numerous organizations, rather than the unique needs of a particular organization, as is the case with custom-developed software. By adopting standard packages, organizations can substantially reduce many of the potential implementation risks commonly associated with custom-developed software. However, it is also true that the nature of the package itself could be a risk factor as the features and functions of the ERP systems may not completely comply with a particular organization's informational requirements. In this study, based on the organizational memory mismatch perspective that was derived from organizational memory theory and cognitive dissonance theory, we define the nature of disparities, which we call "mismatches," and propose that the mismatch between organizational information requirements and ERP systems is one of the primary determinants in the successful implementation of ERP systems. Furthermore, we suggest that customization efforts as a coping strategy for mismatches can play a significant role in increasing the possibilities of success. In order to examine the contention we propose in this study, we employed a survey-based field study of ERP project team members, resulting in a total of 77 responses. 
The results of this study show that, as anticipated from the organizational memory mismatch perspective, the mismatch between organizational information requirements and ERP systems makes a significantly negative impact on the implementation success of ERP systems. This finding confirms our hypothesis that the more mismatch there is, the more difficult successful ERP implementation is, and thus requires more attention to be drawn to mismatch as a major failure source in ERP implementation. This study also found that as a coping strategy on mismatch, the effects of customization are significant. In other words, utilizing the appropriate customization method could lead to the implementation success of ERP systems. This is somewhat interesting because it runs counter to the argument of some literature and ERP vendors that minimized customization (or even the lack thereof) is required for successful ERP implementation. In many ERP projects, there is a tendency among ERP developers to adopt default ERP functions without any customization, adhering to the slogan of "the introduction of best practices." However, this study asserts that we cannot expect successful implementation if we don't attempt to customize ERP systems when mismatches exist. For a more detailed analysis, we identified three types of mismatches-Non-ERP, Non-Procedure, and Hybrid. Among these, only Non-ERP mismatches (a situation in which ERP systems cannot support the existing information needs that are currently fulfilled) were found to have a direct influence on the implementation of ERP systems. Neither Non-Procedure nor Hybrid mismatches were found to have significant impact in the ERP context. These findings provide meaningful insights since they could serve as the basis for discussing how the ERP implementation process should be defined and what activities should be included in the implementation process. 
They show that ERP developers may not want to include organizational (or business processes) changes in the implementation process, suggesting that doing so could lead to failed implementation. And in fact, this suggestion eventually turned out to be true when we found that the application of process customization led to higher possibilities of failure. From these discussions, we are convinced that Non-ERP is the only type of mismatch we need to focus on during the implementation process, implying that organizational changes must be made before, rather than during, the implementation process. Finally, this study found that among the various customization approaches, bolt-on development methods in particular seemed to have significantly positive effects. Interestingly again, this finding is not in the same line of thought as that of the vendors in the ERP industry. The vendors' recommendations are to apply as many best practices as possible, thereby resulting in the minimization of customization and utilization of bolt-on development methods. They particularly advise against changing the source code and rather recommend employing, when necessary, the method of programming additional software code using the computer language of the vendor. As previously stated, however, our study found active customization, especially bolt-on development methods, to have positive effects on ERP, and found source code changes in particular to have the most significant effects. Moreover, our study found programming additional software to be ineffective, suggesting there is much difference between ERP developers and vendors in viewpoints and strategies toward ERP customization. In summary, mismatches are inherent in the ERP implementation context and play an important role in determining its success. 
Considering the significance of mismatches, this study proposes a new model for successful ERP implementation, developed from the organizational memory mismatch perspective, and provides many insights by empirically confirming the model's usefulness.
