IT Firm

I was thinking about spinning up an IT company, but it would probably be too much work compared with finance. To make selling easier and to stand out, I would connect everything to Web 3.0 as follows:

1. Software Development

I would offer a simple declarative, functional, reactive, context-aware information system that is fully dynamic and agent-curated for real-time feedback loops. I really liked EnterpriseWeb and Profium.com at the Semantic Technology & Business Conference. They made me realize that the entire corporate structure of the enterprise is about to fall apart, and that probably goes for governments too. Programmers will be able to collapse entire divisions of enterprises into rule sets and agents, and those jobs then become redundant.

"Declarative and functional" versus, say, "non-declarative and imperative" is mostly a question of semantics: the old way of doing programming means small black boxes of programs, while an open, declarative, functional approach usually means rule-based processing of data that can be expressed and processed by any processing model able to complete the transformation, i.e. context-aware rules. So, if this system allows declarative transformations in its rules/agent logic, then it can handle any OWL the customer has to process. Of course, not all OWL/RDF content has to be parsed; sometimes it just gets passed along or displayed as is. You only need to transform or validate it if you are using it as a processing queue of tasks/events or something like that.

This has lots of benefits. See: http://enterpriseweb.com/solutions/product-categories/
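
To make that concrete, here is a minimal sketch in plain Python (it has nothing to do with EnterpriseWeb's or Profium's actual internals, and the rule names and thresholds are invented): business logic lives in declarative condition/action rules, and a generic agent loop evaluates them against incoming facts, so adding behaviour means adding a rule rather than writing another imperative code path.

    from dataclasses import dataclass
    from typing import Any, Callable

    @dataclass
    class Rule:
        name: str
        condition: Callable[[dict], bool]   # the "when" part, evaluated against a fact/event
        action: Callable[[dict], Any]       # the "then" part, fired when the condition holds

    RULES = [
        Rule("escalate-high-value-order",
             condition=lambda fact: fact.get("type") == "order" and fact.get("total", 0) > 10_000,
             action=lambda fact: print("escalating order", fact["id"], "for review")),
        Rule("auto-approve-small-order",
             condition=lambda fact: fact.get("type") == "order" and fact.get("total", 0) <= 10_000,
             action=lambda fact: print("auto-approving order", fact["id"])),
    ]

    def agent_step(fact: dict) -> None:
        """One pass of the agent: evaluate every rule against an incoming fact."""
        for rule in RULES:
            if rule.condition(fact):
                rule.action(fact)

    agent_step({"type": "order", "id": "A-17", "total": 25_000})

The rules are just data, so the "division collapsed into a rule set" idea above amounts to growing that list instead of growing the codebase.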

====

2. Web/Mobile Development

We would only do decoupled architectures, which enable computing components or layers to execute independently while still interfacing with each other. Every component has all the parts it needs to live on its own and communicates with other components through a predefined API. With such an architecture, the components/layers can be developed independently without having to wait for their dependencies to complete. This leads to pipelined development, resulting in more streamlined and faster development, and it also improves the testability of the components.

Decoupling the UI from its backend brings even more benefits and is often overlooked. You can drive your web UI through well-defined APIs that return JSON/XML/ATOM or whatever format is convenient in your environment. With such APIs defined, the UI developers can use mock data during development instead of waiting for the backend developers to complete their tasks. Since the UI and backend developers work in parallel, idle time and repeated integration are minimized, and because UI developers are typically ahead in the development cycle, this helps backend developers finalize the data formats with minimal iteration. Decoupling also lets you replace or evolve each side independently of the other. For example, if you migrate your backend services from Java to Scala for whatever reason, it has little impact on your UI, and vice versa.
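
As a sketch of how that plays out in practice (the endpoint, field names and functions here are hypothetical, not from any real project): both teams agree on a JSON contract, the UI is driven from mock data that satisfies the contract, and the real backend is swapped in later without touching the UI code.

    import json

    MOCK_PROFILE_JSON = json.dumps({          # stand-in for a hypothetical GET /api/profile/42
        "id": 42,
        "name": "Ada Lovelace",
        "plan": "enterprise",
    })

    def fetch_profile(use_mock: bool = True) -> dict:
        """Return a profile matching the agreed contract, from mock or real backend."""
        if use_mock:
            return json.loads(MOCK_PROFILE_JSON)
        # Later, swap in the real call to the backend without touching render_profile(),
        # because the JSON contract stays the same.
        raise NotImplementedError("real backend not wired up yet")

    def render_profile(profile: dict) -> str:
        """'UI' layer: knows only the JSON contract, not where the data came from."""
        return profile["name"] + " (" + profile["plan"] + " plan)"

    print(render_profile(fetch_profile()))

The point is that render_profile() never changes when the real service arrives; only fetch_profile() does.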

Applications built with such an architecture can correlate events across systems and/or devices and trigger actions and workflows, automating tasks and delivering a new era of efficiency that will transform modern life.
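
A rough sketch of that kind of event correlation, assuming invented event names and a made-up downstream action: events from different systems share a correlation key, and once the expected combination has arrived for that key, a workflow step fires automatically.

    from collections import defaultdict

    REQUIRED = {"payment.captured", "warehouse.packed"}   # pattern that completes a workflow
    seen: dict[str, set[str]] = defaultdict(set)

    def on_event(source: str, name: str, order_id: str) -> None:
        """Record an event from any system and check whether the pattern is complete."""
        seen[order_id].add(name)
        print("[" + source + "]", name, "for", order_id)
        if REQUIRED.issubset(seen[order_id]):
            trigger_shipping(order_id)                    # automated downstream action

    def trigger_shipping(order_id: str) -> None:
        print("-> all conditions met, scheduling shipment for", order_id)

    on_event("payments", "payment.captured", "order-9")
    on_event("wms", "warehouse.packed", "order-9")        # second event completes the pattern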

Of course today’s Web information is siloed by applications, programs and sites made to automate and monetize specific tasks. Your calendar is an application programmed to schedule meetings. Your work documents use a different application. Your playlist is somewhere else. Soccer team schedules are somewhere else. Then there are the shopping, airline, hotel, community service and multitudes of additional sites, all isolated from one another.

By publishing content to Concept Repositories, you remove the silos and create vast storehouses of structured, machine-readable information. These platforms replace or supplement text-based documents with concept-based data, enabling machines to process knowledge in a way similar to human reasoning and to obtain more meaningful results. Devices can then communicate with one another to perform tasks that currently require multiple searches and force the user to integrate all of the search results themselves.
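
Something like this, for illustration only (the schema is made up): instead of a human-readable blurb, the repository holds a concept record, and different applications or devices each pull out what they need without re-parsing free text.

    # One concept record, consumed two different ways.
    concept = {
        "@type": "SoccerMatch",                 # a concept, not just a string of words
        "teams": ["U12 Falcons", "U12 Hawks"],
        "kickoff": "2024-05-11T10:00:00-04:00",
        "venue": {"name": "Riverside Field", "geo": [43.65, -79.38]},
    }

    # A calendar app and a navigation app read the *same* record for their own tasks.
    calendar_entry = (concept["kickoff"], " vs ".join(concept["teams"]))
    navigation_target = concept["venue"]["geo"]

    print(calendar_entry)
    print(navigation_target)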

The development of “decoupled” architecture supports this shift by giving engineers more flexibility to create components that provide unique user experiences across the customer’s journey. In simplest terms, “decoupled” refers to the separation between the back end of your website (your CMS) and the front end (or many front ends).

====

3. Software Agents/Chatbots

In artificial intelligence, an intelligent agent (IA) is an autonomous entity which observes its environment through inputs and acts upon it using actuators (i.e. it is an agent) and directs its activity towards achieving goals (i.e. it is "rational", as defined in economics). Intelligent agents may also learn or use knowledge to achieve their goals. They may be very simple or very complex: a reflex machine, such as a thermostat, is considered an example of an intelligent agent.

[Figure: simple reflex agent]

Intelligent agents are often described schematically as an abstract functional system, similar to a computer program. For this reason, intelligent agents are sometimes called abstract intelligent agents (AIA) to distinguish them from their real-world implementations as computer systems, biological systems, or organizations.

Agents are also colloquially known as bots, from robot.
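
The thermostat mentioned above can be written as a simple reflex agent in a few lines: it maps the current percept straight to an action through condition-action rules, with no internal model of the world. The setpoints are arbitrary example values.

    def thermostat_agent(temperature_c: float) -> str:
        """Simple reflex agent: percept (temperature) -> action, via fixed rules."""
        if temperature_c < 19.0:      # rule: too cold -> heat
            return "heat_on"
        if temperature_c > 23.0:      # rule: too warm -> cool
            return "cool_on"
        return "idle"                 # default rule: within the comfort band

    for reading in (17.5, 21.0, 25.2):
        print(reading, "->", thermostat_agent(reading))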

A decoupled architecture will give rise to mobile software agents (virtual assistants) that can perform tasks or services for an individual based on user input, location awareness, and the ability to access information from a variety of online sources such as APIs.
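
A toy sketch of such an assistant, with a stand-in get_weather function in place of whatever real online source or API would be used: it combines the user's request, the device's location and an online lookup to answer a simple question.

    from typing import Callable

    def assistant(request: str, location: tuple[float, float],
                  get_weather: Callable[[tuple[float, float]], str]) -> str:
        """Combine user input, location awareness and an online source into an answer."""
        if "umbrella" in request.lower():
            forecast = get_weather(location)              # online source reached via an API
            return "Yes, take one." if "rain" in forecast else "No need today."
        return "Sorry, I only know about umbrellas so far."

    # Mocked API response keeps the sketch self-contained.
    print(assistant("Do I need an umbrella?", (60.17, 24.94), lambda loc: "light rain"))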

4. Web Crawlers/On-Demand Cognitive Computing Platform

All of this would be topped off by creating the Concept Web (SyNet), a new Web, separable from the previous Web layer (Web 2.0), that is computable by machines. The Concept Web creates a distinct layer on top of the existing linked-data layer with its own referencing scheme that can still be resolved through the current URI scheme. This Web will have its own way to define and handle terms, concepts, relations, axioms and rules: the structural components of an ontology. Everything else should revolve around it: data population, ontology enrichment, subject indexing, searching, matching and sharing.
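
As a toy illustration only (SyNet is an idea here, not a specification): the structural components listed above, terms/concepts, relations and a rule, held as plain data, with a single inference step that resolves one concept reference.

    # Term inventory (shown for completeness; the rule below only uses relations and facts).
    concepts = {"syn:Person", "syn:Employee", "syn:Organization"}
    relations = {("syn:Employee", "subClassOf", "syn:Person"),
                 ("syn:worksFor", "domain", "syn:Employee")}
    facts = [("syn:ada", "rdf:type", "syn:Employee"),
             ("syn:ada", "syn:worksFor", "syn:acme")]

    # Rule (an axiom of this toy ontology): anything typed as a subclass is also
    # typed as the superclass.
    inferred = [(s, p, parent)
                for (s, p, o) in facts if p == "rdf:type"
                for (child, rel, parent) in relations
                if rel == "subClassOf" and child == o]

    print(inferred)   # [('syn:ada', 'rdf:type', 'syn:Person')]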

SyNet will integrate the data on the Web to create comprehensive, machine-readable Concept Repositories. The result: Web platforms that can think and act like humans, and that can do your work for you.

Yes, the current Web has put the world at our fingertips, but it hasn't done so with much efficiency. Making simple plans, personal or business, requires multiple searches and forces the user to integrate the outcomes of those searches. That's because current search technology takes every request literally and has no knowledge of concepts or actions. Additionally, most Web-based information is stored in databases that are created to be read by humans and “marked up” (think HTML) for basic, literal processing by machines. This places unnecessary limits on the extent to which the machines we depend on (computers, phones, tablets and the software that runs them) can process the data they find.
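
To show the difference in miniature (documents and concept table invented for the example): a literal search only matches the exact string, while a concept-aware lookup matches anything mapped to the same concept.

    documents = ["Cheap flights to Rome in May", "Early summer airfare deals, Italy"]

    def literal_search(query: str) -> list[str]:
        """Today's behaviour: the request is taken literally."""
        return [d for d in documents if query.lower() in d.lower()]

    # Toy concept table mapping words to concepts.
    CONCEPTS = {"flights": "air-travel", "airfare": "air-travel", "rome": "italy", "italy": "italy"}

    def concept_search(query: str) -> list[str]:
        """Concept-aware behaviour: match on shared concepts, not exact strings."""
        wanted = {CONCEPTS.get(w.lower()) for w in query.split()} - {None}
        return [d for d in documents
                if wanted & {CONCEPTS.get(w.lower().strip(",")) for w in d.split()}]

    print(literal_search("airfare"))   # misses the first document
    print(concept_search("airfare"))   # finds both via the shared 'air-travel' concept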

5. Miscellaneous

DevOps/Game Development