
Posts

Showing posts from August, 2017

BECOMING KING IN THE COUNTRY OF THE BLIND

Once there was a one-eyed man who dearly loved to roam over hill and dale, mountain and rock. He would walk and walk and go, go and go and walk. One day, far off, he saw a village whose colors were all jumbled together; a motley, strange village. He drew near. Its roads were odd, its houses were odd, its people were odd. When he entered the village he understood the matter: this was a village of the blind. The eyes of the women, the men, the children, in short of everyone, were shut tight. The one-eyed wanderer decided to settle there. "At least I have one eye," he said to himself. "They say that in the country of the blind the cross-eyed become kings. I'll put myself at their head and live well." The blind had no eyes, but their hands, ears, and noses were very keen. They got along within an order they had built to suit themselves. The man kept watching their bewildering ways; truly, their walking and their talking were of another kind. One day one of the blind stole the property of another. Only the one-eyed man saw it. He proclaimed it at the top of his voice: "So-and-so has stolen so-and-so's goods!"

Network Data Security System Design with High Security Insurance

Design Introduction

Because relatively large bandwidth is widely available, computers worldwide are connected via networks. Computer users can transfer information and exchange data easily over the Internet, and as a result many network data servers have been set up. With this growth, ensuring data security during storage and transmission has become very important. In our project, a combined channel coding and encryption system protects data, ensuring data security even in the case of network invalidation. It also prevents data loss in cases where a single packet of data is revealed, and even where an encryption key is lost. Our goal is to build a new network data security system with high security insurance. A traditional network data security system (e.g., network-attached storage devices or a storage area network) achieves high security through encryption/decryption algorithms such as the Advanced Encryption Standard (AES), the Data Encryption Standard (DES), RC6, and other symmetric-key cryptographic algorithms.
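
As a rough illustration of the symmetric-key family named above, here is a minimal Python sketch using AES in GCM mode via the "cryptography" package. This is not the project's actual scheme: the channel-coding layer and key management are omitted, and the packet contents are invented.

# Illustrative sketch only: symmetric encryption of a data packet with
# AES-GCM. Key distribution, key escrow, and the channel-coding layer
# described above are out of scope here.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_packet(key: bytes, packet: bytes) -> bytes:
    """Encrypt one packet; the random nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)              # 96-bit nonce, unique per packet
    ciphertext = AESGCM(key).encrypt(nonce, packet, None)
    return nonce + ciphertext

def decrypt_packet(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce, then authenticate and decrypt the payload."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)
blob = encrypt_packet(key, b"payload bound for the data server")
assert decrypt_packet(key, blob) == b"payload bound for the data server"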

The Information System's Relationship with the Software Capability Maturity Model

The PACE software was produced as an N-tier implementation of Java 2 Enterprise Edition (J2EE) technology. The software was developed to be independent of platform, application server, and database. CMM (Level 3) is the primary standard that PACE aims to support, the reason being that the CMM model has been widely adopted around the world. In addition, the Software Engineering Institute (SEI) defines CMM Level 3 as the level that provides the infrastructure enabling an organization's software engineering and management processes to be effectively institutionalized across all projects. Each level of CMM is divided into Key Process Areas (KPAs). SEI defines these areas as a set of activities that, when performed, accomplish a specific group of goals important for establishing the organization's process capability. How the PACE software addresses each of the relevant Key Process Areas of CMM Levels 2 and 3 is discussed below.

Cloud Computing Reference Architecture: An Overview

The Conceptual Reference Model

Figure 1 presents an overview of the NIST cloud computing reference architecture, which identifies the major actors and their activities and functions in cloud computing. The diagram depicts a generic high-level architecture and is intended to facilitate understanding of the requirements, uses, characteristics, and standards of cloud computing. As shown in Figure 1, the NIST cloud computing reference architecture defines five major actors: cloud consumer, cloud provider, cloud carrier, cloud auditor, and cloud broker. Each actor is an entity (a person or an organization) that participates in a transaction or process and/or performs tasks in cloud computing. Table 1 briefly lists the actors defined in the NIST cloud computing reference architecture. The general activities of the actors are discussed in the remainder of this section, while the details of the architectural elements are discussed in Section 3. Figure 2 illustrates the interactions among the actors.
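
To make the role division concrete, here is a small Python sketch that models the five actors as an enum. The one-line descriptions paraphrase the model's definitions and are abbreviated; the sketch is a modeling aid, not part of the NIST specification.

# Minimal sketch: the five major actors of the NIST cloud computing
# reference architecture as a Python enum.
from enum import Enum

class CloudActor(Enum):
    CONSUMER = "maintains a business relationship with, and uses services from, a cloud provider"
    PROVIDER = "is responsible for making cloud services available to interested parties"
    CARRIER = "provides connectivity and transport of cloud services from providers to consumers"
    AUDITOR = "conducts independent assessment of cloud services, operations, performance, and security"
    BROKER = "manages the use, performance, and delivery of cloud services and negotiates between providers and consumers"

for actor in CloudActor:
    print(f"cloud {actor.name.lower()}: {actor.value}")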

Getting Performance Right

Just having a faster computer isn't enough to ensure the right level of performance for handling big data. You need to be able to distribute components of your big data service across a series of nodes (see Figure 1). In distributed computing, a node is an element contained within a cluster of systems or within a rack. A node typically includes CPU, memory, and some kind of disk; however, a node can also be a blade, with CPU and memory that rely on nearby storage within the rack. Within a big data environment, these nodes are typically clustered together to provide scale. For example, you might start out with a big data analysis and then continue to add more data sources. To accommodate the growth, an organization simply adds more nodes to the cluster so that it can scale out to meet growing requirements. However, it isn't enough simply to expand the number of nodes in the cluster. Rather, it is important to be able to send parts of the big data analysis to different physical environments.
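
The scale-out pattern described above can be sketched on a single machine: partition the data, fan the partitions out to workers (here, local processes standing in for cluster nodes), and combine the partial results. Real clusters add data locality, scheduling, and fault tolerance on top of this; the sketch below, with invented data and a trivial per-node analysis, shows only the core pattern.

# Minimal sketch of scale-out processing: partition a dataset, fan the
# partitions out to worker processes, then combine the partial results.
from concurrent.futures import ProcessPoolExecutor

def analyze_partition(rows):
    """Per-node work: here, just a count and a running sum."""
    return len(rows), sum(rows)

def partition(data, n_nodes):
    """Split the data into n_nodes roughly equal slices."""
    return [data[i::n_nodes] for i in range(n_nodes)]

if __name__ == "__main__":
    data = list(range(1_000_000))
    with ProcessPoolExecutor(max_workers=4) as pool:   # four "nodes"
        partials = list(pool.map(analyze_partition, partition(data, 4)))
    total_count = sum(count for count, _ in partials)  # combine results
    total_sum = sum(subtotal for _, subtotal in partials)
    print(total_count, total_sum)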

Ten Big Data Best Practices

While we are at an early stage in the evolution of big data, it is never too early to get started with good practices so that you can leverage what you are learning and the experience you are gaining. As with every important emerging technology, it is important to understand why you need to leverage the technology and to have a concrete plan in place. In this chapter, we provide the top ten best practices you need to understand as you begin the journey to manage big data.

Understand Your Goals

Many organizations start their big data journey by experimenting with a single project that might provide some concrete benefit. By selecting a project, you have the freedom to test without risking capital expenditures. However, if all you end up doing is a series of one-off projects, you will likely not have a good plan in place when you begin to understand the value of leveraging big data in the company. Therefore, after you conclude some experiments and have a good initial understanding of that value, put a concrete plan in place.

What is data mining?

Data mining involves exploring and analyzing large amounts of data to find patterns in that data. The techniques came out of the fields of statistics and artificial intelligence (AI), with a bit of database management thrown into the mix. Generally, the goal of data mining is either classification or prediction. In classification, the idea is to sort data into groups. For example, a marketer might be interested in the characteristics of those who responded to a promotion versus those who didn't respond; these are two classes. In prediction, the idea is to predict the value of a continuous (that is, nondiscrete) variable. For example, a marketer might be interested in predicting those who will respond to a promotion. Typical algorithms used in data mining include the following: ✓ Classification trees: A popular data-mining technique that is used to classify a dependent categorical variable based on measurements of one or more predictor variables. The result is a tree with nodes and branches that can be read as a set of if-then rules.
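
As a concrete miniature of the responder example, the following Python sketch fits a classification tree with scikit-learn. The features, toy data, and depth limit are invented for illustration; scikit-learn's DecisionTreeClassifier is one common implementation of the technique, not the only one.

# Minimal sketch of a classification tree on the responder example.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row: [age, prior_purchases]; label: 1 = responded, 0 = did not.
X = [[25, 0], [32, 1], [47, 5], [51, 3], [23, 0], [60, 8]]
y = [0, 0, 1, 1, 0, 1]

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)

# The fitted tree prints as if-then rules over the predictor variables.
print(export_text(tree, feature_names=["age", "prior_purchases"]))
print(tree.predict([[40, 4]]))   # predicted class for a new customer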

Big Data Analytics Solutions

A number of vendors on the market today support big data solutions. Here is a listing of a few solutions that you may find interesting: ✓ IBM (www.ibm.com) is taking an enterprise approach to big data, integrating across the platform and embedding/bundling its analytics. Its products include a warehouse (InfoSphere Warehouse) that has its own built-in data-mining and cubing capability. Its new PureData Systems (a packaging of advanced analytics technology into an integrated systems platform) includes many packaged analytical integrations. Its InfoSphere Streams product is tightly integrated with its Statistical Package for the Social Sciences (SPSS) statistical software to support real-time predictive analytics, including the capability to dynamically update models based on real-time data. It is bundling a limited-use license of Cognos Business Intelligence with its key big data platform capabilities (enterprise-class Hadoop, stream computing, and warehouse solutions).

BIG DATA IMPLEMENTATION

To get the most business value from big data, it needs to be integrated into your business processes. How can you take action based on your analysis of big data unless you can understand the results in context with your operational data? Differentiating your company as a result of making good business decisions depends on many factors. One factor that is becoming increasingly important is your capability to integrate internal and external data sources composed of both traditional relational data and newer forms of unstructured data. While this may seem like a daunting task, the reality is that you probably already have a lot of experience with data integration. Don't toss aside everything you have learned about delivering data as a trusted source to your organization. You will want to place a high priority on data quality as you move to make big data analytics actionable. However, to bring your big data environments and enterprise data environments together, you will need a deliberate approach to data integration.
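
As a hypothetical sketch of that kind of integration, the following Python example joins a relational extract with semi-structured event records using pandas, with a basic quality gate before the join. The file names and columns are invented for illustration; this is one simple pattern, not a prescribed architecture.

# Hypothetical sketch: joining a traditional relational extract with
# newer semi-structured records. File names and fields are invented.
import pandas as pd

# Relational side: e.g., a customer table exported from the warehouse.
customers = pd.read_csv("customers.csv")          # customer_id, segment, ...

# Semi-structured side: e.g., clickstream events captured as JSON lines.
events = pd.read_json("events.json", lines=True)  # customer_id, page, ts, ...

# Basic quality gate before the join, per the data-quality point above.
events = events.dropna(subset=["customer_id"]).drop_duplicates()

# Integrate the two sources on a shared key for downstream analytics.
combined = customers.merge(events, on="customer_id", how="inner")
print(combined.head())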

Cloud computing risks

First of all, we would like to remind the reader that risk is always present wherever information technologies are used, and that one-hundred-percent security is not attainable. Consequently, in a field that is risky by nature, what must be done is to know what the risks may be and to reduce their potential impact to an acceptable level through appropriate countermeasures. Cloud computing is on the agenda mostly because of the risks it brings. To tell the truth, since it is still a developing process, it carries risks specific to the transition period. However, various studies evaluating the risks and the measures to be taken have already been carried out, and guides have been prepared so that the move to cloud computing can be made securely. These efforts look set to continue until accepted standards are reached. Some of these risks deserve closer examination. One of the most important of them is the organization's becoming dependent on the provider it buys services from. When organizations hand their IT services over to an outside provider, switching to a different provider later can prove difficult and costly.

Data Center Trends Shift Staff Workloads

Data centers are becoming lean, efficient strategic assets as they adopt cloud computing, XaaS, self-provisioning models, colocation, and other still-emerging technologies. Achieving the promise of these technologies, however, requires changing work assignments and updating skill sets. "These trends are redefining the data center work environment by reducing the number of physical devices that need human intervention," says Colin Lacey, vice president of Data Center Transformation Services & Solutions at Unisys. "This elevates the required skill sets from 'racking and stacking' to administering tools and automation." While some hands-on work will always be required, it's much less in highly automated or outsourced data centers.

Tasks Shift

Removing lower levels of work does free employees to focus on strategic business priorities, but it also establishes new tasks that didn't previously exist. As Lacey explains, "Those new tasks…"