Luke Lonergan - BIG DATA AND CLOUD COMPUTING
Luke Lonergan is a businessman who takes great pride in what he does. When he isn’t running his business, he loves to study history and firmly believes in the famous quote: “Those who cannot remember the past are condemned to repeat it.”
Saturday, 25 March 2017
The Role Big Data Plays in Enterprise Collaboration - Luke Lonergan
Every organization, and indeed every project, needs to be able to manage data volumes that can grow explosively. Efficient purging and archiving tools are needed to cope with increasing volumes of data. When it comes to collaboration, however, the huge amounts of intelligence available from multiple sources also need to be leveraged.
Consider any large project. It involves not only documentation that runs into the millions but also several thousand people. More significantly, it involves a large volume of intelligence that includes internal communications, processes, and raw information. Thanks to web-based technology and new mobile collaboration tools, data can now be captured easily and shared with project members regardless of where they are. The overall intelligence of any company depends upon its data.
When it comes to Big Data, new information is always flowing in, which means that new processes constantly need to be developed to deal with it, each bringing its own learning curve. All of this information, after all, makes up the intelligence of an organization and, if utilized effectively, can lead to accelerated project schedules, better project quality, and greater profitability.
Features of an Ideal Collaboration
Several aspects need to be looked into when it comes to collaboration: safety, security, neutrality, centralization, limitlessness, and searchability.
One of the first things that comes to mind is safety and security. The data that has been captured should not be accessible to unauthorized parties and should be protected from damage or alteration. A transparent audit trail of all decisions and actions should be kept to support and secure the data.
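As a minimal sketch of what such a tamper-evident audit trail might look like (the class and field names here are hypothetical, not taken from any particular product), each entry below is chained to the previous one by a hash, so altering any past record breaks verification:

import hashlib
import json
import time

class AuditLog:
    """Append-only audit log; each entry carries a hash of the previous one."""
    def __init__(self):
        self.entries = []

    def record(self, user, action):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"user": user, "action": action, "ts": time.time(), "prev": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self):
        """Recompute every hash; returns False if any entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("user", "action", "ts", "prev")}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or digest != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.record("alice", "uploaded drawing rev-B")
log.record("bob", "approved change order 17")
print(log.verify())  # True while no entry has been tampered with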
Ensuring that Big Data is neutral means that there are no ‘super-users’ controlling access to the project data. Everyone who belongs to the project should have equal access to it.
Big Data is centralized when a single active service acts as the nexus point for all project intelligence. A firewall should surround the service so that external parties have only limited access while authorized project members are allowed to exchange data.
When it comes to uploading data, there should be no limits on the amount of data, usage, the number of files, or any other factor. The repository does, however, need to be flexible and scalable so that it can be managed with total transparency.
If the data is searchable, one can query the repository using various criteria. The repository should also be able to sift through masses of data from various sources.
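A repository that is searchable by various criteria can be illustrated with something as simple as an inverted index over document metadata. The sketch below is a hypothetical in-memory example (documents, fields, and values are all invented), not a production search engine:

from collections import defaultdict

# Tiny in-memory index: maps each title word to the ids of documents containing it.
index = defaultdict(set)
docs = {
    1: {"title": "foundation pour schedule", "source": "site-a", "type": "report"},
    2: {"title": "steel delivery manifest", "source": "vendor", "type": "manifest"},
    3: {"title": "foundation inspection notes", "source": "site-b", "type": "report"},
}

for doc_id, meta in docs.items():
    for word in meta["title"].split():
        index[word].add(doc_id)

def search(word, **criteria):
    """Full-text hit on the title, then filter by any metadata criteria."""
    hits = index.get(word, set())
    return [docs[i] for i in sorted(hits)
            if all(docs[i].get(k) == v for k, v in criteria.items())]

print(search("foundation", type="report"))    # both foundation reports
print(search("foundation", source="site-b"))  # only the site-b notes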
Luke Lonergan is a resident of San Carlos, CA, United States. He completed his postgraduate degree at Stanford University. He is a founder of Didera, a database clustering company, where he served as CEO and Chairman in 2000. Luke Lonergan’s background includes 16 years of management experience in computing technology, ranging from innovations in big data and cloud computing to advances in medical imaging systems.
Thursday, 2 February 2017
Strata 2012: Luke Lonergan, "5 Big Questions about Big Data"
How are businesses using big data to connect with their customers, deliver new products or services faster and create a competitive advantage? Luke Lonergan, co-founder & CTO, Greenplum, a division of EMC, gives insight into the changing nature of customer intimacy and how the technologies and techniques around big data analysis provide business advantage in today's social, mobile environment -- and why it is imperative to adopt a big data analytics strategy.
Friday, 27 January 2017
Cloud computing and the related abstractions - Luke Lonergan
The buzzword ‘cloud computing’ is still treated as something of a mystery in the field of Information Technology. Understanding what lies beneath the buzzword takes some digging, and the abstractions behind cloud computing deserve analysis in their own right.
Virtualization is the mother of cloud computing. Applications generally started out in two-tier models and then evolved into three-tier models, where the interface, the data model, and the deployment framework are decoupled and work together efficiently. This approach made easy expansion, deployment, and integration possible for application providers.
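To make the decoupling concrete, here is a minimal three-tier sketch (the class names are illustrative only): the interface tier talks only to the service tier, which alone touches the data tier, so any tier can be swapped or scaled independently:

# Data tier: knows only how records are stored.
class DataModel:
    def __init__(self):
        self._rows = {}

    def save(self, key, value):
        self._rows[key] = value

    def load(self, key):
        return self._rows.get(key)

# Service tier: business rules, with no storage details and no UI.
class AppService:
    def __init__(self, model):
        self.model = model

    def register_student(self, student_id, name):
        self.model.save(student_id, {"name": name})
        return f"registered {name}"

# Interface tier: talks only to the service, never to the data model.
class WebInterface:
    def __init__(self, service):
        self.service = service

    def handle_request(self, student_id, name):
        return self.service.register_student(student_id, name)

ui = WebInterface(AppService(DataModel()))
print(ui.handle_request("s-101", "Asha"))  # registered Asha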
How is Cloud Computing abstract?
Let’s first consider a scenario. An academic business runs different schools at different locations, each with slight differences in its individual operations. The IT department has the challenge of providing centralized access to information about all these campuses at the same time via a dashboard. The business also plans to expand into a new territory where no groundwork has yet been done.
The kind of infrastructure the system will need is not known in advance. In the traditional environment, each campus runs an individual server system to maintain its data, and that data must be available for centralized reporting. Meeting all these challenges requires deploying and maintaining a scalable, reliable, consistent, secure, networked infrastructure. Instead of building a full-fledged physical network, the organization resolves to buy infrastructure services through the cloud. It purchases storage space, networking services, data hosting services, and a cloud-based application to run all the work seamlessly, without hassles, via the Internet.
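One possible way this looks in practice is sketched below, using Amazon S3 via the boto3 library purely as an example provider; the bucket and file names are hypothetical, and the snippet assumes AWS credentials are already configured. The campus application sees only a logical bucket, never a physical disk:

import boto3  # AWS SDK for Python; S3 is used here only as an example provider

s3 = boto3.client("s3")

# Hypothetical names: one campus uploads a report into shared cloud storage.
s3.upload_file(
    Filename="campus_a_enrolment.csv",   # local report produced at campus A
    Bucket="example-school-group-data",  # logical storage bought by subscription
    Key="reports/2017/campus_a_enrolment.csv",
)

# Head office later reads the same object without knowing where it physically lives.
obj = s3.get_object(Bucket="example-school-group-data",
                    Key="reports/2017/campus_a_enrolment.csv")
print(obj["Body"].read()[:100])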
The business owner does not know where exactly the storage space is physically located but owns a subscription to the space for as long as it is used. The business owner also uses the application, which connects all the campuses via a private network, to fetch data, process reports, save, modify, and delete data in the cloud, without knowing where exactly the software stack is deployed (though it is traceable by networking specialists).
Data availability despite link failures is another highly abstract factor. For example, if one of the business sites goes down, an alternative link from a different back-up deployment region or zone is made available to the business owner, and the switch away from the failed site happens in the blink of an eye. This kind of downtime management is not only abstract but also very business friendly.
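The failover described here can be sketched as trying a primary endpoint and falling back to a backup region when the link fails. The URLs below are hypothetical placeholders, and real cloud platforms handle this behind a single address rather than in client code:

import urllib.request
import urllib.error

# Hypothetical endpoints: the primary region first, then a backup zone.
ENDPOINTS = [
    "https://primary.example.com/reports",
    "https://backup-zone.example.com/reports",
]

def fetch_report():
    """Return the report from the first endpoint that answers."""
    last_error = None
    for url in ENDPOINTS:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()  # the caller never sees which link served it
        except (urllib.error.URLError, OSError) as err:
            last_error = err        # link down: fall through to the backup
    raise ConnectionError(f"all endpoints failed: {last_error}")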
Thus, cloud computing works its apparent miracles by keeping many essential details abstracted away from view, so that the business for which it is deployed runs smoothly.
Article Published by Luke Lonergan
Wednesday, 11 January 2017
Proper Data Analysis Can Help Small Scale Enterprises - Luke Lonergan
In today’s world, the word “data” is a staple of every other conversation you hear. Our love of technology and the web has allowed data to accumulate on a global scale. Everything we do creates scores of data that can be used for analysis. With the reach of social media and an increasingly technologically aware market, data analysis has become part and parcel of strategic research for many companies.