Search results for: Memory de-duplication

Number of results: 1,802,836

2016
M. SWARNA

IJRAET Abstract: Data de-duplication is used to eliminate duplicate copies of data. In cloud storage it reduces storage space and upload bandwidth, since only one copy of each file is kept in the cloud and that copy can be shared by many users. De-duplication therefore improves storage utilization, but it also raises a privacy challenge for sensitive data. The aim of this pap...
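
The idea above is simple to sketch: keep a single physical copy per unique file content and let further uploads of the same content just add a reference. The toy C program below is an illustration only; it uses FNV-1a as a stand-in for the cryptographic hash (e.g. SHA-256) a real de-duplicating store would use, and an in-memory table in place of cloud storage.

```c
/* Toy file-level de-duplication: one physical copy per unique content,
 * shared by any number of users. FNV-1a stands in for the cryptographic
 * hash (e.g. SHA-256) a real system would use. */
#include <stdint.h>
#include <stdio.h>

static uint64_t fnv1a(const unsigned char *p, size_t n) {
    uint64_t h = 14695981039346656037ULL;              /* FNV-1a offset basis */
    while (n--) { h ^= *p++; h *= 1099511628211ULL; }  /* FNV-1a prime */
    return h;
}

#define MAX_FILES 64
static struct {
    uint64_t fp;                 /* content fingerprint */
    size_t len;
    const unsigned char *data;   /* the single stored copy */
    int users;                   /* how many users reference it */
} store[MAX_FILES];
static int nfiles = 0;

/* "Upload" a file: store it only if no copy with the same content exists. */
static int upload(const unsigned char *data, size_t len) {
    uint64_t fp = fnv1a(data, len);
    for (int i = 0; i < nfiles; i++)
        if (store[i].fp == fp && store[i].len == len) {
            store[i].users++;    /* duplicate: just add another reference */
            return i;
        }
    store[nfiles].fp = fp;
    store[nfiles].len = len;
    store[nfiles].data = data;
    store[nfiles].users = 1;
    return nfiles++;
}

int main(void) {
    const unsigned char doc[] = "quarterly report";
    int a = upload(doc, sizeof doc);   /* first user: copy is stored */
    int b = upload(doc, sizeof doc);   /* second user: existing copy is reused */
    printf("slot %d == slot %d, users = %d\n", a, b, store[a].users);
    return 0;
}
```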

In cloud computing, multiple users can share the same physical machine, which can leak secret information, in particular when memory de-duplication is enabled. Flush+Reload is a cache-based attack that exploits this resource sharing. The T-table implementation of AES is commonly used in crypto libraries such as OpenSSL. Several Flush+Reload attacks on T-table implementat...
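
For illustration, a stripped-down Flush+Reload probe on x86 looks roughly like the sketch below. The shared buffer, the 100-cycle threshold, and the probe structure are assumptions made for the sketch; in a real attack the probed addresses are cache lines of the victim's T-tables, which become physically shared through the shared crypto library or memory de-duplication.

```c
/* Minimal Flush+Reload probe sketch (x86, GCC/Clang intrinsics).
 * The probed address must map to memory physically shared with the
 * victim, e.g. a shared library page or a de-duplicated page. */
#include <stdint.h>
#include <stdio.h>
#include <x86intrin.h>

#define THRESHOLD 100   /* cycles; machine-dependent assumption */

static char shared_line[64] __attribute__((aligned(64)));  /* stand-in for a shared T-table line */

static uint64_t timed_reload(const volatile char *addr) {
    unsigned aux;
    uint64_t start = __rdtscp(&aux);   /* read time-stamp counter */
    (void)*addr;                       /* reload: fast if the line is already cached */
    uint64_t end = __rdtscp(&aux);
    return end - start;
}

/* Returns 1 if someone (e.g. the victim) touched the line since the last flush. */
static int probe(const volatile char *addr) {
    uint64_t t = timed_reload(addr);
    _mm_clflush((const void *)addr);   /* flush again so the next round starts cold */
    _mm_mfence();
    return t < THRESHOLD;
}

int main(void) {
    _mm_clflush(shared_line);          /* start with the line evicted */
    _mm_mfence();
    /* ... the victim would run here; accessing shared_line re-caches it ... */
    printf("line accessed since flush: %d\n", probe(shared_line));
    return 0;
}
```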

De novo duplication of 2q is very rare. Most 2q duplications result from familial translocations and are accompanied by simultaneous monosomy of another chromosome segment. To our knowledge, based on a search of the English-language literature, fewer than 20 cases of isolated 2q duplication have been reported. Here we present a 4.5-year-old Iranian boy, born to a non-consanguineous marriage, who was referred ...

Journal: Shanghai Ligong Daxue xuebao, 2021

Data deduplication is a crucial technique for compacting data and reducing duplication when data is transferred. It is widely used in the cloud to limit storage consumption and to save transmission bandwidth. Before data is outsourced, an encryption mechanism ensures the integrity of sensitive data during the process. The SHA algorithm is applied to the text, and padding is appended to the text to generate the security bits. In de-dupl...
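
As a hedged illustration of the fingerprinting step, the sketch below computes SHA-256 digests as de-duplication keys using OpenSSL's one-shot SHA256() (build with -lcrypto). Note that the length padding mentioned above is appended to the message by the SHA algorithm internally as part of digest computation; the chunking and the in-memory index are assumptions of the sketch.

```c
/* Sketch: SHA-256 digests as de-duplication keys.
 * Uses OpenSSL's one-shot SHA256(); build with -lcrypto.
 * SHA-256 appends the length padding to the message internally. */
#include <openssl/sha.h>
#include <stdio.h>
#include <string.h>

#define MAX_CHUNKS 64
static unsigned char index_fp[MAX_CHUNKS][SHA256_DIGEST_LENGTH];
static int nchunks = 0;

/* Returns 1 if this content is new (and records its digest), 0 if it is a duplicate. */
static int is_new_chunk(const unsigned char *data, size_t len) {
    unsigned char digest[SHA256_DIGEST_LENGTH];
    SHA256(data, len, digest);
    for (int i = 0; i < nchunks; i++)
        if (memcmp(index_fp[i], digest, SHA256_DIGEST_LENGTH) == 0)
            return 0;                                  /* same digest: already stored */
    memcpy(index_fp[nchunks++], digest, SHA256_DIGEST_LENGTH);
    return 1;                                          /* unseen: caller stores the chunk */
}

int main(void) {
    const unsigned char a[] = "hello cloud";
    const unsigned char b[] = "hello cloud";
    /* prints "1 0": the second, identical chunk is detected as a duplicate */
    printf("%d %d\n", is_new_chunk(a, sizeof a - 1), is_new_chunk(b, sizeof b - 1));
    return 0;
}
```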

2011
Atish Kathpal, Matthew John, Gaurav Makkar

Data de-duplication is essentially a data compression technique for eliminating coarse-grained redundant data. A typical flavor of de-duplication detects duplicate data blocks within the storage device and de-duplicates them by placing pointers to a single copy rather than storing multiple copies at various places on the disk. Since the advent of deduplication, the conventional approach has been to scale-...
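
A minimal sketch of that flavor of de-duplication follows: each file is recorded as a list of indices ("pointers") into a pool of unique blocks, so a repeated block is stored once and referenced many times. The toy 4-byte block size and the byte-by-byte block comparison are assumptions; production systems use large blocks and compare cryptographic fingerprints instead.

```c
/* Sketch of block-level de-duplication: a file is recorded as a list of
 * indices ("pointers") into a pool of unique fixed-size blocks. */
#include <stdio.h>
#include <string.h>

#define BLOCK 4            /* toy block size; real systems use e.g. 4 KiB */
#define MAX_BLOCKS 256

static char pool[MAX_BLOCKS][BLOCK];   /* each unique block is stored exactly once */
static int npool = 0;

/* Return the pool index for this block, adding it only if unseen. */
static int intern_block(const char *blk) {
    for (int i = 0; i < npool; i++)
        if (memcmp(pool[i], blk, BLOCK) == 0)
            return i;                  /* duplicate block: point at the existing copy */
    memcpy(pool[npool], blk, BLOCK);
    return npool++;
}

int main(void) {
    const char data[] = "AAAABBBBAAAACCCC";   /* four blocks, one of them repeated */
    int refs[4];
    for (int b = 0; b < 4; b++)
        refs[b] = intern_block(data + b * BLOCK);
    /* prints "file = [0 1 0 2], unique blocks stored = 3" */
    printf("file = [%d %d %d %d], unique blocks stored = %d\n",
           refs[0], refs[1], refs[2], refs[3], npool);
    return 0;
}
```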

Journal: Bina (بینا)
Abbas Bagheri, Ocular Tissue Engineering Research Center, Shahid Beheshti University of Medical Sciences, Tehran, Iran; Reza Jafari, Mazandaran University of Medical Sciences, Sari, Iran and Ophthalmic Research Center, Shahid Beheshti University of Medical Sciences, Tehran, Iran; Mohadeseh Feizi, Mazandaran University of Medical Sciences, Sari, Iran and Ophthalmic Research Center, Shahid Beheshti University of Medical Sciences, Tehran, Iran

Purpose: To report a case of optic disc duplication, a rare congenital disorder characterized by two well-defined discs in one eye. Case report: A 19-month-old child presented with unilateral epiphora in the right eye since birth. The right eye was smaller than the left eye and mild ptosis was apparent. Nasolacrimal duct probing was performed under general anesthesia. The examination r...

2015
Hui Liu, Jie Song, Yu-Bin Bao

Data quality in the data warehouse is crucial to decision-makers, and data duplication is considered one of the critical factors affecting it. Therefore, data de-duplication is an essential process for data warehousing. In particular, for a real-time data warehouse it is necessary to ensure not only data quality in real time, but also the performance of the front-end queries and...

2016
J. Li, X. Chen, M. Li, P. Lee

One of the most important challenges for today's cloud storage services is the management of the ever-increasing amount of data. To make data management scalable, de-duplication has become a widely known technique for reducing storage space and transfer bandwidth in cloud storage. Rather than keeping multiple copies of data with identical content, de-duplication eliminates redundant ...

2014
N. Patil, C. R. Barde

Following current trends, users prefer the cloud for storing their personal data as well as data they want to share with others. In such a storage system, the same data is sometimes stored by different users. This duplication makes cloud storage inefficient and wastes bandwidth. To make the cloud more economical in its use of storage and bandwidth, some techniques are prop...
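
One common technique of this kind is client-side (source) de-duplication, sketched below under simplifying assumptions: server_has(), server_store(), and the FNV-1a fingerprint are hypothetical stand-ins for the real storage API and hash. The client sends a fingerprint first and transfers the file body only when the server does not already hold identical content, which saves upload bandwidth for duplicates.

```c
/* Sketch of client-side (source) de-duplication: the client sends a
 * fingerprint first and uploads the file body only if the server lacks it.
 * server_has()/server_store() and fingerprint() are hypothetical stand-ins. */
#include <stdint.h>
#include <stdio.h>

static uint64_t fingerprint(const char *p, size_t n) {   /* FNV-1a, standing in for SHA-256 */
    uint64_t h = 14695981039346656037ULL;
    while (n--) { h ^= (unsigned char)*p++; h *= 1099511628211ULL; }
    return h;
}

#define MAX 32
static uint64_t server_index[MAX];     /* fingerprints the server already holds */
static int server_count = 0;

static int server_has(uint64_t fp) {
    for (int i = 0; i < server_count; i++)
        if (server_index[i] == fp) return 1;
    return 0;
}
static void server_store(uint64_t fp) { server_index[server_count++] = fp; }

/* Returns the number of bytes that had to cross the network. */
static size_t client_upload(const char *data, size_t len) {
    uint64_t fp = fingerprint(data, len);
    if (server_has(fp))
        return sizeof fp;              /* duplicate: only the fingerprint was sent */
    server_store(fp);
    return sizeof fp + len;            /* first copy: fingerprint plus full body */
}

int main(void) {
    const char doc[] = "shared lecture notes";
    printf("first upload : %zu bytes\n", client_upload(doc, sizeof doc));
    printf("second upload: %zu bytes\n", client_upload(doc, sizeof doc));   /* much smaller */
    return 0;
}
```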

2015
Priyanka N. Patil, C. R. Barde

With the increasing use of digital information, users prefer to store their data in cloud systems. In a cloud storage system many users may store the same data, leading to duplication and high bandwidth utilization. Some techniques have been proposed to make the cloud more efficient and effective with respect to storage and bandwidth. At present, data de-duplication is an effective t...
