Thursday, August 3, 2017

Investor versus Customer


The prosumers envisioned by Alvin Toffler are informed, conscious customers acting in a saturated market, and their opinions carry real weight in product design decisions.

Toffler's analysis was so sharp that prosumers have been the most powerful influencers of the software industry from its very beginning. End-user comments are treated as valuable resources, from their opinions about proof-of-concept apps to their continuous feedback.

Prosumers are our ultimate investors: by paying for our services they are funding the next versions of their software tools.

In order to obtain funds for a brand new project we need a couple of key investors - usually friends and/or business partners, or a bank - or we may try crowdfunding. The latter means starting a fundraising campaign, which can be considered a stand-alone project due to its considerable resource needs and risks (building a stable community of supporters and spreading the word take time and a lot of skilled work).

And here come the big questions: what kind of investors do we really need, how do we find them, and how do we keep them without compromising our project?

When opting for crowdfunding, we are committing ourselves to the expectations of a more or less numerous community, and in the best case we turn its members into our prosumers.

When opting for a small group of investors we need to analyze their personalities and histories carefully, otherwise at some point our project could be hijacked, or it could degrade into a battlefield for competitors with big egos.

When shaking hands with investors one needs to consider the same risks as when onboarding employees, but when the collaboration does not work out, it's generally far more painful, complicated and time-consuming to change things.

In the case of IT&C projects the experience and skill set of an investor can also be a delicate matter. As a junior analyst I was surrounded by tech guys and missed the perspective of business guys. Later, as a semi-senior analyst, I missed the perspective of tech guys mastering certain technologies I didn't have access to.

When defining my own project idea, I thought for a while about finding investors who wouldn't interfere in decisions at all. After some time I realized that all the effort put into finding investors with a predefined profile is a waste of resources, and that such campaigns build me neither a personality nor a history.

Mature investors and project owners are looking for similar human values, and the best thing we can do to attract and keep the right investors is to focus on our to-do list and on personal development.


Wednesday, July 19, 2017

Where Is the Money?


In 2002, somewhere in Eastern Europe, I was watching TV as a young guy walked into a local studio to be interviewed about the future of the Internet. He was introduced as an expert on the subject, and with all the self-confidence of his age group he explained that the big money would no longer be in the public-facing Internet, but in solutions for private networks.

I was stunned, because his reasoning opened a window to a different world for me. A couple of years ago I felt the same way when I started working on a project for an ISO 9001 company - those people live in a universe where it's natural to think in the long run.

Back in 2002 Internet penetration in Eastern Europe was low, and the dot-com bubble hadn't affected us. The expert was building his reasoning on facts like the high cost of securing virtual places and the fundamental human need for safety.

And he was right.

During the next decade we got mobile devices, cheap broadband Internet, social networks and cloud solutions, and IoT teamed up with Big Data. After experiencing the advent of Web 2.0 - the world of interactive, fast Internet applications - we got wrapped in personalized information bubbles shaped by Web 3.0 tools, and we learned that security, reliability and usability are not free.

Nowadays there is killer competition between digital agencies serving the need for public-facing company websites. On Freelancer and similar sites it's easy to find the job offers of such agencies operating in the SOHO market: they are looking for students from non-industrialized countries who accept working for $1-$2 per hour.

On the other side of the IT&C service palette are the consulting companies specializing in complete cloud solutions: hardware & software selection, tuning, monitoring and scripting. These shops rely on certified domestic employees and represent a safe option for the middle market.



Thursday, July 13, 2017

Tech Guy and Business Guy

They are a complementary team by design, so to speak. The tech guy spots the essence of people's painful problems and works out solutions for them. The business guy does the marketing and the feasibility analysis.

So why do these two guys so often fail to get along?

Over the years I've seen many different situations: young or even mature people suffering from a lack of education and experience, or from too much greed. I believe our personalities keep evolving throughout our lives and everybody has the capacity to overcome his/her weaknesses, but to turn a project into a success all the team members need to be in their best psychological shape.

Some people are nice one-man business guys - even if they appear to be leading thriving small businesses, all their team members are just executing orders. Other people bring the sun with them and turn their competitors into team assets - these special guys sometimes make millions, but most of the time they just make their families and friends happy.

About a year ago I had a written job interview, and among other things I had to answer this question: "What 2 'unicorn' companies do you consider most over-inflated and subsequently most susceptible to being impacted by a bubble burst?"

My reply stated the following: "...Uber and similar dispatcher companies, picture sharing, text and voice chatting, social networks (SnapChat, Pinterest) might lose even more than 50% of their current market values."

Of course somebody else got the job, because the job poster (like many, many others) underestimated the importance of technological factors in the evolution of business entities.

Those guys who bring the sun with them are natural leaders, able to run companies with happy tech guys and business guys, and I consider it a matter of elementary common sense to vote for them.



Sunday, July 2, 2017

Why I'm After OneDrive

In short: because of SharePoint.

Several years ago the Dropbox desktop client was running on my computer, and I could exchange documents for free without ever opening the browser interface of this popular cloud-based service. Then I uninstalled the desktop client and continued using Dropbox as a personal file sharing service, because my clients preferred Google Drive or OneDrive for teamwork.

Then I learned: "There's no such thing as a free lunch." Technically it's easier to extend existing data centers with server and storage resources than to increase the throughput of the networks used to access those servers - consequently companies offering cloud-based services still give each subscriber a limited amount of free cloud storage, but the efficiency of using that free storage is highly variable.

The stick always has two ends, so if my file uploading or downloading experience is poor, and my cloud-based file sharing service becomes unresponsive or throws weird error messages, the problem could be at their end or at mine.

For example, my local ISP has to deal with high traffic during our evening hours, and I've learned that uploading big files (several MBs each) takes much longer during peak hours. After getting confused when Dropbox failed to upload a file of mine (less than 1 MB) during peak hours, I learned that I need to double-check whether my files have actually been uploaded to Dropbox, and if not, to repeat the operation after 30-60 minutes or so.
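For those who do run a desktop sync client, that double-check habit is easy to script. Below is a minimal PowerShell sketch of the idea - the paths and the retry delay are purely hypothetical, not my actual setup:

  $source      = 'C:\Work\report.xlsx'               # hypothetical local file
  $destination = 'C:\Users\Me\Dropbox\report.xlsx'   # hypothetical synced folder

  do {
      Copy-Item -Path $source -Destination $destination -Force
      # Compare hashes to confirm the copy really succeeded.
      $ok = (Get-FileHash $source).Hash -eq (Get-FileHash $destination).Hash
      if (-not $ok) {
          Write-Warning 'Check failed, retrying in 45 minutes...'
          Start-Sleep -Seconds (45 * 60)              # wait out the peak hours
      }
  } while (-not $ok)
  Write-Output "File verified at $destination"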

After getting a feel for how things work with an Office 365 Enterprise subscription, I understood the difference between free and paid file sharing, and I understand those people who just don't want to look back.

Beyond the fact that one's files always get when and where they need to be, several people can update different sections of the same file at the same time without losing anything, and a refined system of access rights becomes available: deleting, forwarding or downloading a file, for example, can be permitted or denied.

Small companies and/or teams have various cloud-based file sharing alternatives to choose from, and testing several of them is the best way to find out which one works better for their needs.

The free OneDrive service has the advantage that, after upgrading to a commercial package, SharePoint can be plugged in and configured to suit higher security and heavier file sharing needs.


Sunday, June 18, 2017

Do-It-Yourself Apps


There are countless tools out there for building mobile apps, from frameworks designed for developers to SaaS solutions targeting non-technical users.

While "cross-platform development" is a commonly abused term referring to designing an app and building its corresponding app packages for various devices, "do-it-yourself app" is an even more confusing denomination.

For a teenager, who already knows some html, JavaScript is a natural option for getting started with the basic programming concepts, such as memory variables, conditional statements or loops. Using a basic framework for packaging his/her first bytes, and running it on a mobile device can be incredibly motivating for digging deeper in the IT world.

For a student, who already knows about servers, document sharing, databases and server-side scripting, putting together a mobile app for employing those server resources is going to be his/her first architecting experience.

For a non-technical person, who handles documents via Office, SharePoint or Google Drive, using an online wizard for generating a mobile app could be a positive initial experience, followed by various scenarios down the road.

If a team of non-techie persons is using the services of an IT&C company, who is taking care of hosting their documents and data, and is also offering the possibility of generating apps for accessing that data, then good chances are that things will keep working as expected.

If our team of non-techie persons is using a SaaS solution including an online app generating wizard and a middleware for wiring together the generated app with data in a database hosted by a third-party, then everything can happen.

In case their database consists of Excel or similar workbooks on Google Drive, OneDrive, DropBox etc., the scenario is plain file sharing, which works ok while those files are set as read-only, or are accessed by one person at a time for writing.
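Enforcing that read-only rule doesn't even need a special tool - a short PowerShell sketch is enough (the folder path below is just an example):

  # Mark every workbook in a hypothetical shared folder as read-only,
  # so casual edits cannot collide with each other.
  Get-ChildItem 'C:\Shared\Reports' -Filter *.xlsx -Recurse |
      ForEach-Object { $_.IsReadOnly = $true }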

If their database is a SQL engine, they will certainly need an IT guy to set up the connection between the database and the middleware, and to advise on the dos and don'ts of the database engine they have.
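A first step for that IT guy would typically be a quick connectivity and permission check with a dedicated low-privilege account. A rough PowerShell sketch - the server, database, account and table names are all made up:

  # Connection string values and the table name below are placeholders.
  $connectionString = 'Server=sql01;Database=AppData;User Id=middleware_ro;Password=example;'
  $connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
  try {
      $connection.Open()
      $command = $connection.CreateCommand()
      $command.CommandText = 'SELECT TOP 1 * FROM dbo.Customers'
      [void]$command.ExecuteReader()
      Write-Output 'Connection and read access OK'
  }
  catch {
      Write-Warning "The middleware account cannot reach the database: $_"
  }
  finally {
      $connection.Close()
  }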

In both cases the response time will always be slow, because the middleware is accessing a third-party system instead of a data store located in the same data center.

In my opinion the middleware itself is the weak link of a SaaS solution offering access to a wide variety of data stores and/or file formats.

Ever since the difficulties of migrating data between dBase, Access, MySQL, MS SQL Server and similar tables, it has been known that there are no shortcuts, no "one size fits all" automations for working with data. The more types of storage a piece of software handles, the bigger it grows and the more error conditions occur.

In other words, the more types of files a middleware handles, the bigger, slower and more error-prone it gets.

IMHO an app-generating SaaS solution might be suitable for small companies with low-to-medium risk business processes sharing a couple of tables with limited write access, or for projects with urgent data sharing needs, until the project owner acquires a more customized solution.

Monday, June 12, 2017

Why I'm After SharePoint


Like the punch line of several commercials, my reply could be: "Because I deserve it." I consider SharePoint Microsoft's best bet after Office, a solution in which the user's need to share a wide variety of files meets the integrity and traceability offered by a server process safeguarding those files.

While "cloud" is a marketing wrapper for high-standard hosting services rather than a particular technology, SharePoint is a very specific integration based on Windows Server, IIS, ASP.NET, MS SQL Server and other Microsoft products, addressing the needs of medium and high risk business processes.

If you can organize your processes so that people send all their requests for updating the company's shared documents to an assistant, then SharePoint might not be a necessity for you.

If your processes require only shared read access to files, and you are fine with strict rules preventing the same file from being updated by more than one person within a couple of hours, then SharePoint might not be a necessity for you.

If you need the same document to be updated by more than one person during a given time interval, then SharePoint is a necessity for you.

Of course you can have your own private SharePoint installation, but for a small company Office 365 is in most cases the way to start using SharePoint, and Office 365 runs in Azure.

Azure is a winning cloud hosting solution for those who want an easy-to-manage, always up and running system, and there are multiple alternatives for those who find it more appropriate to combine bare-metal cloud hosting with the services of an IT&C company that ensures the configuration, tuning and smooth running of their system.

But again... why am I so much after SharePoint? Because I've spent many years with various software products based on plain file sharing, or on SQL engines with plenty of threading issues... and I'm tired of the many problems, complaints, disputes, claims, and waste of time and money resulting from misusing those products.

Sunday, May 28, 2017

The BPO Saga


Accounting and debt collection services are classic examples of businesses answering the need to externalize certain business processes. The first call centers appeared in the 1960s, when phone systems became generally available consumer goods, and the first digital marketing services appeared in the 1990s, with the Internet.

Nowadays the BPO family includes members ranging from multi-location contact centers to boutique accounting firms. Although different in geographic presence and organizational charts, all of them share a common set of particularities:
  • operating in a highly competitive market that always demands better quality and lower prices
  • permanently looking for new opportunities, because both customers and employees are moving targets
  • relying on skilled and experienced decision makers, because the industry is not attractive to investors and there are no funds for errors
Nowadays even a student can organize a small team offering digital marketing services, or a home-based phone answering service. He/she won't have contracts with banks or supermarket chains, but by investing a lot of time and some money in his/her startup, he/she can gain the experience needed to get hired by a BPO company operating in the middle or high-end market.

Once a tech guy enters a professional BPO company, he/she will experience the "love me or leave me" trait of this industry. While these companies must follow the newest trends in technology and the economy, and they encourage their tech personnel to study, get certified and grow personally and professionally, high salaries and stable incomes are not typical in this field.

BPO workers are mostly underpaid, and sometimes overworked as well. This happens because of a complex business model based on numerous smaller and bigger projects handled in parallel by people working in shifts. Micromanaging such processes is nonsense - most workers have to multitask (handling at least two projects at the same time). A fixed full-time or part-time salary supplemented with bonuses is widely accepted for this kind of work.

Let's face it, spending 8 hours on the phone is not a dream job, and wrangling Office documents for ever-changing customers is not something you're happy to do all your life. BPO workers keep looking for other jobs, and the tech guys might jump ship for pecuniary reasons.

On the other hand, most customers of BPO companies are entities running 1-2 year projects, or startups with limited funds. In time the projects end, and the startups either fail or grow and start considering more appropriate solutions for their new needs.

For BPO managers, hunting for new customers and new personnel has to be a permanent activity. For them the gap between contract income and salary expenses means first of all an investment fund in technology and education to keep the boat afloat.

The funny boys of the BPO family are Upwork and Freelancer. Both define themselves as meeting places for employers and professionals, but they are used mainly for BPO purposes and for experimental projects.

While Upwork has been struggling for two years to break into a hypothetical middle market, Freelancer is adapting itself to a global market by using the gap between the prices and salaries of different countries. In my opinion, whoever spends the benefits of this gap on anything other than the latest technologies and education will probably fail in the BPO business.

Contact centers already have a tradition of moving from one country to another; virtual admin and digital marketing teams are following their path as Internet bandwidth grows.

I see BPO as a saga because, in order to survive, these companies have to fight continuously with the constraints of the project management triangle while doing their best quality work.

Thursday, May 25, 2017

PowerShell as Rescue Party


Over the last two years Microsoft has reorganized its certification system, and MVA now offers several so-called Learning Paths, two of them based on PowerShell. Although it's categorized as a tool for IT professionals, I consider PowerShell a fundamental tool for developers as well.

As most end users work in small and medium-sized companies, most IT guys are expected to offer working solutions, not just complete isolated tasks.

Thinking in terms of solutions pushes us to analyze and define the problem at hand, to shortlist possible alternatives, and to consider the pros, cons, HR and financial aspects of those alternatives - in short, it teaches us project management.

The nature of end-user activities and workflows indicates when and where a web app, a shell script or a desktop app works better.

PowerShell always comes in handy as a rescue party when an emergency arises, when undecided users demand concrete results, or when the work environment is chaotic.

This is because PowerShell has been designed to be the IT guy's Swiss Army knife. Of course PowerShell has evolved together with the Windows OS versions, and it's neither a universal glue for 32-bit and 64-bit software nor a roll of scotch tape for parallel and batch processing.

When I did my first homework with PowerShell and MS SQL Server, I was impressed that my small script could read CSV (XML etc.) files pushed by a different system, update my database, and send notifications when certain error conditions occurred.
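A stripped-down sketch of that kind of script - not the original one, and with every folder, server, table and mail setting replaced by a placeholder - could look like this:

  # All names below (folder, server, table, columns, addresses) are placeholders.
  $dropFolder = 'D:\Inbox\Csv'
  $connection = New-Object System.Data.SqlClient.SqlConnection 'Server=sql01;Database=Orders;Integrated Security=True;'
  $connection.Open()

  foreach ($file in Get-ChildItem $dropFolder -Filter *.csv) {
      try {
          foreach ($row in Import-Csv $file.FullName) {
              # Insert each CSV row into a staging table.
              $cmd = $connection.CreateCommand()
              $cmd.CommandText = 'INSERT INTO dbo.Incoming (Code, Amount) VALUES (@c, @a)'
              [void]$cmd.Parameters.AddWithValue('@c', $row.Code)
              [void]$cmd.Parameters.AddWithValue('@a', $row.Amount)
              [void]$cmd.ExecuteNonQuery()
          }
      }
      catch {
          # Notify somebody when a file cannot be processed.
          Send-MailMessage -SmtpServer 'mail.example.com' -From 'jobs@example.com' `
              -To 'admin@example.com' -Subject "Import failed: $($file.Name)" -Body "$_"
      }
  }
  $connection.Close()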

Later I learned that SharePoint workflow automation and remote server monitoring also rely on PowerShell - so it was certainly worth the time and money spent.
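The remote monitoring part can start as modestly as a scheduled script like this sketch (the computer and service names are only examples):

  # Ping each server, then check one critical service on it.
  $servers = 'web01', 'web02', 'sql01'
  foreach ($server in $servers) {
      if (-not (Test-Connection $server -Count 1 -Quiet)) {
          Write-Warning "$server is not responding"
          continue
      }
      $service = Get-Service -ComputerName $server -Name 'W3SVC' -ErrorAction SilentlyContinue
      if (-not $service -or $service.Status -ne 'Running') {
          Write-Warning "IIS is not running on $server"
      }
  }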

Business models change at the same pace as the economic environment, so the proportion of rapidly updatable shell scripts in an IT system tends to grow - that's why learning some PowerShell is becoming more and more important.


Saturday, May 20, 2017

Synching Calendars?


Innate multitaskers and firefighters thrive in a chaotic environment. They don't mind handling 2-3 phones, half a dozen IM channels and a browser with 1-2 dozen open tabs.

Easygoing professionals might pick up several messaging and time management tools, and one day they realize they have a number of calendars, contact lists and message lists to sync, because the data they need is spread across services that integrate more or less well.

Is hiring a part-time assistant a good solution to this kind of new problem produced by new technologies? The answer is - as always - it depends.

The primary rule is that the business model dictates what kind of project management and process management tools suit the company. People often put off changing software, or consider their existing processes too complicated to be restructured.

"We love our ...name here... software so much that we've also purchased ...here comes a list of tools... to fill in its missing features" could be a winning innovation model, or hard stuff with a steep learning curve for new employees, or a hack causing quality degradation over time.

In the case of a BPO company it's good to look for software based on Kanban boards or Gantt charts that offers views of past and future time intervals. Todoist is OK for a freelance VA, but not for a VA team, and whoever has dozens of processes might consider migrating from Asana.

If a professional needs a predictable and quiet environment to perform well, then he/she will definitely form a good complementary team with a natural multitasker assistant, regardless of his/her software tools.



Thursday, March 2, 2017

From Blogs to Content Management

Lately I've spent quite a bit of time with Moodle and Office 365, and a few hours with WordPress checking some embedding tricks.

To be honest, I've never been a WordPress fan... among other things I'm not keen on its architecture, and I don't want to use custom markup instead of HTML.

As a wannabe e-learning course author I've been pleasantly surprised by the good integration between Moodle and Docs.com, or even a Sway containing a classic presentation with an embedded YouTube video - practically, a teacher without an IT background can serve dozens of students free of charge.

The forms and quizzes provided by Office 365 can be used as feedback surveys, while Moodle's scales, gradebooks, question banks, assignments, workshops and other specific entities complement a general-purpose document authoring and multimedia authoring package excellently.

Nowadays a teacher can feel overwhelmed by the multitude of software tools designed to make his/her work more efficient, and an administrative assistant might also need half a dozen CMSs to do his/her job.

Stock management, billing & drop-shipping, website & SEO, team management & collaboration are the minimal toolset for a company doing online commerce, and the situation is no simpler in other industries.

With the advent of management systems we now have hundreds of online tools offering highly interactive, user-friendly solutions while hiding the complexities of their internal logic as much as possible.

In such an environment I don't think it's a good idea to invest time and money in CMSs that require mastering a multitude of static templates (a template should be visually editable and self-explanatory) or big manuals describing dozens of counterintuitive workflows instead of offering visual, context-sensitive guidance for novice users.



Tuesday, December 13, 2016

PM Tools


Think about a restaurant where people can choose whatever they want from the menu. How would you organize 8 workers in the kitchen to prepare 27 meals in the shortest time? Project management tools are meant to help solve this kind of problem.
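In its simplest form this is a load-balancing exercise. Here is a naive PowerShell sketch that hands each meal to the currently least-loaded cook - the preparation times are invented, and real kitchens (and real projects) have dependencies this toy model ignores:

  # Naive greedy scheduler; all the numbers are invented.
  $cooks = 1..8  | ForEach-Object { [pscustomobject]@{ Id = $_; Minutes = 0; Meals = @() } }
  $meals = 1..27 | ForEach-Object { [pscustomobject]@{ Name = "Meal $_"; Minutes = (Get-Random -Minimum 5 -Maximum 31) } }

  # Taking the longest meals first usually gives a shorter overall finish time.
  foreach ($meal in ($meals | Sort-Object Minutes -Descending)) {
      $cook = $cooks | Sort-Object Minutes | Select-Object -First 1
      $cook.Meals   += $meal.Name
      $cook.Minutes += $meal.Minutes
  }

  $cooks | Format-Table Id, Minutes, @{ n = 'Meals'; e = { $_.Meals -join ', ' } }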

A company can have 2-3 projects... or dozens of projects to work on at the same time. Each project can be broken down into processes (some must be done sequentially, others can run in parallel), and each process can be broken down into activities.

Both processes and activities can be prioritized and re-prioritized as needed.

Gantt charts are classic tools suited to a small number of slowly moving projects with well-known outcomes - in construction you have the design, and you can make good estimates for processes and activities.

Kanban boards are newer tools used in agile project management, and they are good for visualizing the processes of projects with many unknowns - in service companies you have to redistribute workers and machines according to incoming orders and technical incidents, while the IT&C industry deals with the most rapidly changing factors of all, such as traffic on the wire/air and user requests.

There are many dozens of PM tools out there; I'm going to list three of them.

Trello for 2-3 small projects: https://trello.com/

KanbanTool for bigger projects - it has "swimlanes" (projects and processes can be kept together and handled easily), "sub-tables" (for complex projects), and in general it scales well as a company grows: http://kanbantool.com/

Microsoft Project - although I've never evaluated the cost of adopting Office 365 with the additional goodies: https://products.office.com/en-us/project/compare-microsoft-project-management-software

Thursday, July 28, 2016

Dotnet Core On Duty


Several hosting providers have added ASP.NET Core 1.0 to their offering. In other words, after nearly four years of client-side presence, .NET Core is now on duty on the server side.

The hardware and communication technologies that emerged after the first version of the .NET Framework made a major redesign of Microsoft's managed code environment necessary, and .NET Core is finally ready to go for early adopters.

Some people testing or evaluating ASP.NET Core's feature set wonder why it doesn't include a mailer, an image processing library, a DataAdapter or a SignalR implementation.

In my opinion this is because it has been designed as a modern multi-platform tool with a loosely coupled architecture and Docker containers in mind, making maximum use of the appropriate platform-specific software resources.

The server-side operating system of your choice already has native tools for mailing, charting, generating images, rich text and data sheets, or handling multimedia files, and those tools will certainly work with better speed and stability than a generic library.
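On Windows, for instance, even a short script can lean on such platform-level tools. The sketch below renders a small image with GDI+ (System.Drawing) without touching any web-framework library - the text and the output path are just examples:

  # The text and output path are only examples.
  Add-Type -AssemblyName System.Drawing
  $bitmap   = New-Object System.Drawing.Bitmap 300, 80
  $graphics = [System.Drawing.Graphics]::FromImage($bitmap)
  $graphics.Clear([System.Drawing.Color]::White)
  $font = New-Object System.Drawing.Font 'Arial', 16
  $graphics.DrawString('Monthly report', $font, [System.Drawing.Brushes]::Black, 10, 25)
  $bitmap.Save('C:\Temp\banner.png', [System.Drawing.Imaging.ImageFormat]::Png)
  $graphics.Dispose(); $bitmap.Dispose()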

ASP.NET as middleware does not need to duplicate the role of a web server, a game server or a media streamer; those roles are normally delegated to specialized local processes, whose concrete pros and cons depend on the server operating system and your third-party vendors.

From a project management point of view, clearly defining the objectives and the domain of relevance is a key aspect of a successful project, so in my opinion ASP.NET Core is on the right track and is a good choice for new projects with a service-oriented architecture.


Friday, July 22, 2016

Multi-platform Development Now and Then


Between 2002 and 2004 I was a brave full-time enterprise employee preparing for a freelance career. I spent some time playing with HTML, JavaScript, Java, PHP, C++ and several Linux distros.

At that time developing cross-platform code was not only a shiny prospect but a common user requirement, and getting one's Windows applications to run on Wine was a cool feature.

When I had the time to look into the core of a free multi-platform language framework written in ANSI C (like PHP or Python), I realized the scale of the human investment and dedication needed to develop and maintain such products, which predestines them to form an oligopoly in terms of market structure.

Between 2005 and 2008 it was still a good decision to invest in multi-platform applications - at that time the libraries based on managed code were not mature enough to serve efficiently a considerable list of market demands, typically coming from domains where business processes were changing rapidly.

Meanwhile, the spread of multicore processors and broadband Internet services made possible new programming patterns focused on better server response times and more responsive user interfaces with rich, internationalized content - multiple challenges calling for the refactoring of classic libraries.

During the previous decade the hardware industry evolved more rapidly than the software for the new technologies; that's why mobile operating systems like Symbian, and now Android, could achieve such popularity - something had to be put on the new hardware to get it working and selling.

Due to the continuous diversification of processors and hardware architectures, an increasing number of software companies have started choosing Java or .NET for their long-term projects.

For a small or medium software company, doing its own cross-platform coding is no longer a financially feasible option, and managed code is employed, among other things, for execution speed improvements.

The development effort invested in .NET for parallel and asynchronous programming has led to a solid foundation for business-critical apps deliverable within the limits of a concrete triangle of budget, time and quality.

.NET is my world; I'm not familiar enough with Java to write about its recent evolution.

Thursday, July 21, 2016

Successful Software Projects?


Reid Hoffman's opinion about early user testing is now taught in courses: "If you aren't embarrassed by the first version of your product, you shipped too late." This is because nowadays prototypes are used to present brand new product ideas.

Prototypes are working drafts, similar to the mockups used by architects. The look and feel of a software prototype is pretty much the same as that of a finished product, and people can get their hands on it, although most of its functionality is missing or replaced by static content (input forms, reports etc.).

Prototyping is not a new software development methodology: you can find 20-30 year old custom utilities that started as throw-away prototypes and were then forcibly kept in production by their fans, regardless of the costs implied by such decisions.

Unfortunately it frequently happens that investors behave just like the above-mentioned fans and push early prototypes into the public zone instead of targeting a limited, knowledgeable audience of alpha testers.

Once an early prototype reaches the free and highly competitive market, the feedback coming from informed users will certainly not be positive, even if the product idea itself was great.

The toolset used for preparing a prototype is also a key factor. Currently there is a large number of scripts, libraries, frameworks and database engines suitable for rapid prototyping and for minimizing the cost of getting a prototype out of the door ("fail early and fail often" applied in the context of a product portfolio).

The problem with most such tools is that they may have serious limitations regarding aspects like scalability, use of cloud technologies, internationalization, extensibility or integration with other software.

If a low-cost prototype gains popularity and investors, sooner or later the original code and tools have to be replaced, which usually implies changing teams - a decision that can be fatal for the future of a product.

After all, software is an extension of hardware, and due to market pressure continuous change management is needed to keep a software product attractive to users.

The key figure in successful software projects has always been a manager who receives enough respect, trust and funds to select the right resources for doing the right thing at the right time.

Personally I'm after engineer-led startups, a sane new trend.

Workbook Versus Database


The first step towards a database is usually one's first Excel sheet. In time one's sheets get collected into workbooks, then shared within a team or distributed to a number of people.

Finally, when the mess gets out of control, some IT guys save the business by creating a database and processes for updating it.

Most of the time users spend years with their growing sheets, and sometimes they invest in additional hardware just to be able to keep using some "cool" sheet containing hundreds of thousands of formulas.

Excel employs so-called "tight loops" to get through the calculations as quickly as possible, and these tight loops are big resource consumers.

Sooner or later a sloppy sheet with too many formulas will overtax the operating system's resource management capabilities. In other words, workbooks are not scalable data containers.

Theoretically it's possible to remove all the formulas from a sheet and to use VBA or other scripts for calculations.
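As a taste of that approach, here is a minimal PowerShell sketch using Excel COM automation: it does the arithmetic in script code and writes back a plain value instead of keeping a column of live formulas (the file name, sheet and cell ranges are invented):

  # File, sheet and cell ranges below are invented.
  $excel = New-Object -ComObject Excel.Application
  $excel.Visible = $false
  $workbook = $excel.Workbooks.Open('C:\Data\sales.xlsx')
  $sheet    = $workbook.Worksheets.Item(1)

  $total = 0
  for ($row = 2; $row -le 1000; $row++) {
      $total += [double]$sheet.Cells.Item($row, 3).Value2   # column C holds the amounts
  }
  $sheet.Cells.Item(1, 5).Value2 = $total                   # a plain value, no formula

  $workbook.Save()
  $excel.Quit()
  [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)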

In VBA, tight loops can be mitigated by forcing them to write a value into a cell from time to time or to call DoEvents, which yields control so the operating system can catch up with pending events and housekeeping.

The real problem is that, beyond a certain level of complexity, implementing one's business logic in VBA (or another script) plus workbooks leads to more expensive and less reliable software than opting for a scalable database solution.

Free Source As Public Library?


In 60 years IT has evolved from a subject of scientific research into a manufacture, and then into an industry with its own design, management and quality standards.

There are now practically three generations of specialists working in different organizations who have witnessed these major shifts and have had to drop the "previous" technology and business model in order to find their way with the "current" one.

20-25 years ago software development was still manufacture-based, and the market was so hungry for new products that users accepted testing alpha-stage free software packages in their own time.

Then, within a decade, the global spread of the Internet put pressure on developers to invest effort in architecting loosely coupled solutions rather than growing monolithic desktop applications.

The investments coming from commercial companies helped the software business evolve from manufacture into industry, and the need for interoperability was beneficial for standardization.

10-15 years ago the saturation of the software market and the economic slowdown affected numerous software companies.

Most of them adopted free software usage and outsourcing as collaboration models in order to consolidate their businesses and partnerships. During this adaptation process the limitations of the GPL license became evident, and other licensing models (MIT, Apache) grew more popular.

As the life cycle of a software product is 3-5 years, the current market offering includes a considerable number of applications employing modules designed with earlier hardware architectures in mind.

On the other hand, a server-side interpreter or a database engine with threading problems doesn't play well in the cloud, and JavaScript is not suitable for everything needed on the client side.

Short-term investors might be right to opt for extending the life cycle of a classic software product, but in the long run carefully selecting new tools and using them correctly is what will keep the boat afloat.

(2014 - 2016)

How Many Is Too Many?

When designing or updating a software product, one needs to consider both human and technical factors. Wherever I refer to a form below, the same goes for web pages.

Some Biological Limits: 

Tables that auto-refresh frequently (every 10-20 seconds) and forms that open and close frequently are very tiring for the eyes.

Sharp visual acuity covers only a small spot (a spatial arc of about 10 out of 360 degrees), just enough to hold a text column for reading - in practice this is about 40 characters per line with the default medium font (if your eyeballs are forced to move horizontally, the text line is too long).

Some Psychological Limits:

When a document's background image distracts attention (intense colors, crowded patterns, reduced contrast between background and text), users get tired in a short time.

Our divided attention can manage up to 6-7 things at once; consequently a form should not have more than 6-7 groups of controls (menus, tabs or group boxes), and a group should not have more than 6-7 controls (not counting labels).

When this limit is exceeded, the user feels overwhelmed and frustrated by the user interface.

The Business Logic:

The quantity and frequency of data exchanges with other computers decide what networking solution we need.

The type of database engine, software and hardware we need depends on the number and contents of documents, on the archives and on the processing requirements.

Resource Management:

A Windows application is a collection of modules (executables and libraries) that can be shared by multiple users across a network.

Our local copy needs to fit within our local hardware resources, and it should behave decently when consuming network resources or accessing remote servers.

Forms and Controls:

When designing the user interface it's important to keep a good balance between user requirements and technical limitations. 

Considering the answers to the questions below will help in structuring the user interface.

How many forms can I have in a project? Maybe 2, 4 or 10 - it depends on how many controls they contain in total.

Each control is a window that uses memory, handles events and needs to be (re)painted from time to time. Consequently each control consumes RAM, processor time and video capabilities.

How many controls can I have on a form? Counting all the items in a form's object tree can help us evaluate our forms. If a form contains more than 50-100 controls, it's time to think about splitting it in two.
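Counting is easy to automate. Below is a small recursive PowerShell/WinForms sketch that walks a form's object tree - the demo form is built on the fly just to have something to count:

  Add-Type -AssemblyName System.Windows.Forms

  function Get-ControlCount([System.Windows.Forms.Control]$parent) {
      # Count the direct children plus everything nested inside them.
      $count = 0
      foreach ($child in $parent.Controls) {
          $count += 1 + (Get-ControlCount $child)
      }
      return $count
  }

  # A throw-away demo form, just to have an object tree to walk.
  $form  = New-Object System.Windows.Forms.Form
  $group = New-Object System.Windows.Forms.GroupBox
  1..5 | ForEach-Object { $group.Controls.Add((New-Object System.Windows.Forms.TextBox)) }
  $form.Controls.Add($group)

  "Controls on this form: $(Get-ControlCount $form)"   # prints 6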

It's important to know that using tabs is good for keeping the user interface tidy, but all the controls on a tab are created together with the parent form, regardless of their visibility.

The more powerful a machine is, the higher its limits are, but spreading controls over multiple forms will always be a necessity.

(2011 - 2014)
 

Welcome!


About two years ago I wrote my first blog entry in a private space, and now I feel ready to go public.

As an old-school IT generalist I've made my way from custom-made proggies to standardized packages, and along the way I've discovered that in our industry the study materials change in less than a decade.

"Welcome to My World!", and your constructive criticism is more than welcome!