Tuesday, September 29, 2020

The Poor Man’s AI

Sometimes I have fun checking the logs of my website. Over the years I've changed domains, hosting providers, and technologies, but certain script attacks keep coming in exactly the same style. When I look up a few of the IP addresses they originate from, most of the time I find them already reported by other admins.

A few weeks ago I discovered a new phenomenon: an IP reported and blacklisted by a bot. In other words, a webmaster with a passion for coding, fed up with WordPress hackers, has invested some time and effort into automating the filtering of access logs for common WP attack patterns and the blacklisting of the originating IP addresses.
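Just to illustrate, here is a minimal sketch (TypeScript for Node) of what such an automation might look like - the log location, the output file, and the patterns are my own assumptions, not the actual script I stumbled upon:

```typescript
// Sketch: scan an access log for common WordPress attack patterns and
// collect the offending IP addresses. Paths and patterns are illustrative.
import { readFileSync, appendFileSync } from "fs";

const ACCESS_LOG = "/var/log/nginx/access.log";       // assumed log location
const BLOCKLIST = "/etc/nginx/conf.d/blocklist.conf"; // assumed output file

// Requests that only make sense against a WordPress install;
// on a non-WordPress site they are almost certainly probes.
const wpPatterns = [/wp-login\.php/, /xmlrpc\.php/, /\/wp-admin\//, /wp-content\/plugins\//];

const offenders = new Set<string>();
for (const line of readFileSync(ACCESS_LOG, "utf8").split("\n")) {
  if (wpPatterns.some((p) => p.test(line))) {
    // In the common log format the client IP is the first field.
    const ip = line.split(" ")[0];
    if (ip) offenders.add(ip);
  }
}

// Append each offender as an nginx "deny" rule; deduplication against the
// existing blocklist is left out of this sketch.
for (const ip of offenders) {
  appendFileSync(BLOCKLIST, `deny ${ip};\n`);
}
console.log(`Blacklisted ${offenders.size} IP address(es).`);
```

Run something like this from cron and reload the web server, and you get the poor man's version of that blacklisting bot.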

Taking into account that various hosting providers scan sites to discover WP vulnerabilities, and that script kiddies may use IP rotators, it's hard to say that blacklisting a particular IP address is always a good idea.

For me the remarkable thing is that people are starting to teach their scripts to identify an unknown script (bot, crawler, you name it) by its behavior. The first intelligent antiviruses adopting behavior analysis lifted software security to a new level, and in my opinion the main utility of AI is offering new possibilities as a tool.

Since the advent of online shopping, travel booking, sports betting, trading, and property listings, countless crawlers are sent out day by day, hour by hour, or even more frequently to gather data. While high-end and middle-market companies are already hiding their data sources behind paid APIs and using refined AI solutions to block undesired bots, the low-end markets depend on cheaper, less sophisticated software automation.

A small family shop or local business doesn't lose money by serving a hundred bot visits per day via a classic hosting package, but a regional online business hosted in the cloud usually has a big database; it pays a quantifiable price for the outgoing traffic generated by bots, and the slower response time of its servers might be noticed by its clients.

Many times the speed of getting the latest data sets, or a specific projection of a large amount of data, is the key to a business's success. Ultimately knowledge is power, so for a commercial entity it becomes profitable at some point to erect a fence against bots.

In a hi-tech country you can buy whatever data you need. In a non-hi-tech country, the price-quality ratio of the data you can collect depends on the local culture. In the grey area of partially digitized data, AI may eventually be used to analyze audio tracks and videos in order to rate the protagonist's objectivity.


Thursday, September 17, 2020

Hiring Ceremonies

Hiring strategies are culture- and industry-dependent, even if the final decisions appear to be dominated by the daily mood or delusions of a manager.

According to unofficial estimates, in Spain 60%-80% of jobs are taken by family members, friends, or their recommended acquaintances. Based on the discussions I've followed over the years on various social networks, I believe the estimate is representative of the neo-Latin communities - I mean the many hundreds of millions of people living in a Romance-language-speaking country, city, or neighborhood.

People belonging to communities with Anglo-Saxon origins seem to have more appetite for experimentation; for instance, in the USA or Germany a bigger proportion of work opportunities are taken by newcomers (outsiders, new faces) than in a neo-Latin country.

In fact the whole telecommuting movement and the specific PM methodologies are based on the Yankee approach to handling multicultural teams and organizations. Their respect for diversity is manifested in the multitude of project types thriving in virtual spaces.

As a freelancer I've seen both small software boutiques and companies with siloed departments; team leaders making hiring choices based on textual answers to a couple of questions, and CEOs hiring HR experts, or running automated video interviews.

Many hi-tech companies use multistep pre-screening interviews for hiring. Even if the recordings of those conversations are analyzed with AI tools, the entire process reminds me of an old movie scene in which a mature lady interviewed jobless youngsters and selected the right candidates in minutes.

Twenty-five years ago I told my manager at the time that I was not afraid of the future, because trends can be calculated, but that I was afraid of individuals, who are unpredictable. Since then I've learned to deal with uncertainty, and to appreciate people for the good things they've done so far, not for the mistakes they might make in the future.




Friday, September 11, 2020

Free Source with Upselling?

It has never worked as a business model. Although you can find many companies using free source tools, none of them has managed, for years in a row, even to break even from selling paid add-ons meant to enhance free source packages.

From a marketing point of view it sounds promising to label a service offering as based on a community-driven tool, one meant to focus on the users' real needs as opposed to the considerations of an abstract enterprise leadership board.


Presenting a business entity whose employees make a living from and contribute to the free source ecosystem sounds like declaring the company part of the sharing economy, and it may attract talent from Generation Z - there's nothing wrong with that.


The real problem is with those who confuse a marketing tactic with a business strategy, and who fund startups insisting on such a strategy. It's never a pleasure to work on a service condemned to fail - it's a high-stress environment where the initial moments of success are followed by an endless loop of “whatever I do is the wrong thing to do” moments.


The history of long- and short-lived free source projects demonstrates that the need for paid software add-ons is scarce and/or random, and that the revenue collected from sales would at best cover the hardware and hosting costs of keeping such activities up.


This happens because most people and companies using free tools don't have sufficient resources to pay for the commercial alternatives of those tools, or could not keep their offerings competitive in their markets if they used commercial software; consequently they are focused on avoiding operating expenses as much as possible.


The long-lived free source projects are backed by strategic users, or are funded as side projects by financially stable organizations.



Thursday, April 23, 2020

Copy-pasted Business Plans


They come in all shapes and sizes, from free get-rich-quick brochures to fill-in-the-blank templates distributed by banks for obtaining a business loan.

The mirage of living a good life, or even a luxury lifestyle, on passive income is always tempting, but very few get there, and only for a very short time. Receiving millions of clicks for an article or a video is largely a matter of chance, and securing a contract that assures a 7- or 8-figure income always comes with a cost - one you are either ready to pay or not.

I'm not discouraging anybody from investing without domain knowledge or from playing the lottery, but I consider the two similar in essence, and I'd allocate only small amounts of spare money to such experiments.

While experimentation is considered a fundamental technique for implementing the business idea of a lean start-up, it's not easy to do right, as the large percentage of failed new companies shows.

Over the years I've seen a number of very different approaches to developing a new business, and very different chains of decisions. The common root cause of the failures I've seen so far has been an improperly elaborated and/or applied business plan, one that follows patterns learned from already established organizations.

If I had been in partnership with all those people I've seen give up on their business ideas, I'd be an experienced investor by now. As a freelancer I can only offer an opinionated piece of software for assisting would-be product owners in their financial decisions, available at: https://www.microsoft.com/store/apps/9NBLGGH52LJC




Wednesday, March 11, 2020

It's Time to Change

A pandemic is not something anybody wishes for. The administrative measures meant to slow down the spread of the new virus are based on common sense. A region where everybody is sick would produce many more victims and much more collateral damage than a quarantine.

The chain of events now confronts us with our new reality. For the first time in history, our globally connected communication systems are confronting us with the limitations of our resources and with the pitfalls of taking for granted whatever information is shared on the Internet.

Global warming and the effects of chemical pollution are not yet evident to everybody, and it's possible to produce data sets and impressive presentations meant to deny them, but fever, pneumonia, and limited healthcare resources cannot go unnoticed.

The new virus threat has made numerous managers, clerks, and teachers aware that they already have the technology and tools they need to work remotely. Too many and too long meetings are counterproductive - that's one of the main messages of agile.

Business people have an unprecedented opportunity to learn more about the risks and drawbacks of long-distance supply chains and missing alternative sourcing, and in general about how the economic slowdown of any macroregion affects the others.

Globalization's effects and consequences just cannot be swept under the carpet anymore. It's crystal clear that all the property, productive capacity, and inventions of the world are useless without the proper knowledge and the willingness to make use of them.

The most valuable asset of our epoch is applicable knowledge. The rapid growth of tuition costs over the last decades, the real estate bubble, the decline of hedge funds, and now the restless stock market all signal that the rules are changing and that it's time to rethink many of the old recipes.

Somewhat like our immune system, social media has also started to develop antibodies capable of identifying and trashing destructive information, but that's a long process, and we are only at its beginning.


Wednesday, January 1, 2020

Community Culture


This year we are ending a decade of profound transformations in the IT&C industry. On the one hand, important software tools like .NET Core and React have been released under the MIT license; on the other hand, well-known companies have reshaped their business models (Alphabet, Facebook) or have been acquired (Red Hat, Tableau, etc.).

It looks like the big guys are diversifying their personnel and processes in order to offer better services, and are trying to adapt themselves to a rapidly changing economic context.

In the meantime a great number of IT startups are doing their best to attract classic investors, crowdfunding, or even venture capital, and stagnating companies with experienced engineers are trying to reinvent themselves by taking over the clients switching from classic hosting to managed cloud hosting.

As a freelancer I keep up to date with the changing list of roles outsourced by companies in hi-tech countries. While in the first part of the decade numerous digital agencies and small telcos were looking for a cheap, generalist workforce in order to preserve their competitiveness, for the past couple of years senior specialists (full-stack developers, QA team leads, infrastructure engineers) have been more and more in demand.

This decade I've learned that software tools, just like hardware and all other products, are evolving towards diversification and specialization. While 30-40 years ago a software developer was considered really good when mastering a single programming language, nowadays companies are looking for professionals with T-shaped skills.

This decade the biggest cloud hosting companies have learned that they have to offer multiple operating systems and tool sets, because the "Linux vs Windows" dispute is just ridiculous from the point of view of system integrators.

In a world where system software, programming languages, and scripting languages are all just tools with their pros and cons, the project requirements drive the right mix of hardware and software to be used.

Emotional debates about why a particular piece of software is "good" or "bad" miss the point: good or bad for what? A startup with around 200 hits per hour does not always need middleware with support for asynchronous operations, or a (No)SQL engine ready to scale horizontally.

With a well-modularized software architecture, it is possible to change the middleware, the database, and the client apps step by step.
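As a sketch of what I mean by modular (the names here are hypothetical, not taken from any particular project): the application code talks to an interface, so the storage engine behind it can be swapped without touching the callers.

```typescript
// Hypothetical port/adapter sketch: the application depends on an interface,
// so the concrete database (or middleware) behind it can be replaced step by step.
interface OrderStore {
  save(order: { id: string; total: number }): Promise<void>;
  findById(id: string): Promise<{ id: string; total: number } | undefined>;
}

// First implementation: plain in-memory storage (fine for a prototype or for tests).
class InMemoryOrderStore implements OrderStore {
  private orders = new Map<string, { id: string; total: number }>();
  async save(order: { id: string; total: number }): Promise<void> {
    this.orders.set(order.id, order);
  }
  async findById(id: string): Promise<{ id: string; total: number } | undefined> {
    return this.orders.get(id);
  }
}

// Later a SQL- or document-database-backed class can implement the same
// interface; the calling code below does not change at all.
async function checkout(store: OrderStore): Promise<void> {
  await store.save({ id: "A-1", total: 42 });
  console.log(await store.findById("A-1"));
}

checkout(new InMemoryOrderStore());
```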

The real problem is choosing between comparable tools like C# and F#, Java and Scala, Python and Ruby, etc. What I've learned during my 15+ years spent around open-source tools is that the community culture is what matters most.

If you can identify with the decisions, results, and working style of the community leader(s), and you are happy with the communication channels, then chances are that their project is a good fit for your product or service.

As always, as long as your job skills keep improving and you are appreciated at work and by family and friends, the community is good for you. When you experience persistent communication problems, infrequent updates, or quality degradation, there are two main options: try to turn the negative trend positive by contributing, or research and choose a different open-source project.

It can happen that your requirements have changed, or you have an opportunity to take your career to the next level - then again, be grateful for the good moments and for your growth, and follow your calling.



Saturday, November 30, 2019

Digital Nomads and Hourly Heroes


If telecommuting is your bread and butter, you already know the difference between these two idealized figures, dreamt up by marketing and business people respectively.

Most laptop advertisements depict the digital nomad as an ever-carefree, jovial traveler selecting beautiful places to work through his or her to-do list on the latest and greatest device.

The hourly heroes are expected to act as on-call gurus, experts, rock stars, or ninjas (you name it) in order to get things right, on time, and on budget.

Daydreaming is not a bad thing per se - focusing from time to time on our goals and visualizing them are useful practices - but between wishful thinking and great achievements lies a never-ending iteration of working and learning. Along the way we learn which things are the best and most important for us, and how to achieve them.

Nowadays working online is a lifestyle rather than a temporary solution for youngsters and people between jobs. The ubiquitous Internet helps us acquire the educational resources we need to reach our potential, and then find communities where our contributions will create added value.

The Internet has enabled the development of a new, community-built culture, which is nurturing the democratization of education.

For months we have been witnessing more or less violent movements all over the world - many young people are out on the streets protesting against various things, often taking inappropriate actions.

Mass movements always carry the risk of being manipulated and diverted by bad actors. I trust that the new generation will learn its lesson well and will manage to wind down the segregationist practices that still affect the education systems of multiple countries.



Sunday, October 27, 2019

A Farewell to SPAs


A user interface of medium-high complexity includes many dozens of forms, and most of the time it has to manage big tables bound to highly interactive grid controls. In technical terms this requires creating, updating, and deleting thousands of in-memory objects at high speed, in a manner that keeps the system responsive and protected from excessive memory fragmentation.

Reconciling the client device's responsiveness with network round-trips and garbage collection cycles has always been a pain, and in this iron triangle of software architecture, browser-based web applications are part of the problem rather than the solution.

The proliferation of layered software architectures did not cure any major problem related to user interfaces. Migrating a monolithic solution from WebForms to MVC does not force developers to sanitize the front-end.

Regardless of whether the UI forms are generated by the middleware or by client-side JavaScript, ignoring the fact that client-side memory management has low scalability is like shooting ourselves in the foot.

For a newcomer to web programming a SPA front-end sounds excellent, because by employing a JavaScript framework he or she feels able to offer a great user experience. And he or she is right in the case of software solutions that require only a very basic user interface.

In all other cases the end users will be faced with the infamous side effects of too many memory objects being managed by the browser, or even by the local operating system.

I agree with Generation Z regarding client-side form templating, but I want to see each complex form or table in a new browser tab, one that is closed automatically as soon as possible. Nowadays each browser tab runs in its own process, and closing a tab is a very efficient way to clean up the memory garbage produced by the tab's owning process.
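Here is a minimal browser-side sketch of the pattern I have in mind (the URL and the completion message are illustrative assumptions):

```typescript
// Sketch: open a heavy form in its own tab, then close that tab as soon as the
// work is done, so the browser can reclaim the whole process at once.

// In the main page: open the form and wait for a "done" message from it.
const formTab = window.open("/orders/edit?id=42", "_blank");

window.addEventListener("message", (event) => {
  if (event.origin === window.location.origin && event.data === "form-done") {
    formTab?.close(); // the tab's process - and its memory garbage - goes away
  }
});

// In the form page itself, after a successful save:
//   window.opener?.postMessage("form-done", window.location.origin);
//   window.close(); // a script-opened tab is allowed to close itself
```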

Instead of a SPA driven by a monolith, I'm after a bunch of pages driven by services and microservices. Saying farewell to solutions based on a single SPA is a "when" question, because growing client- or server-side monoliths is an invitation to problems.




Tuesday, August 20, 2019

What Should I Learn Next?

For IT workers this is THE million-dollar question. I think most of us agree that keeping up with the latest technologies needs to be a recurring task in our calendars, but deciding when to replace which of our software tools looks to me like the most difficult problem faced by developers, architects, and CIOs.

Answering the question of what needs to be replaced next drives the organization's innovation and self-development processes towards realistic goals, like leveraging cloud technologies.

Moving into the cloud is a “when” question - these days running a VM does not cost more than a decent shared hosting package. And don't be mad at your IT guy for asking about your budget; ultimately that amount of money decides what type of solution you can start from.

Let's face it: since the advent of the Internet (a packet-switched, routed, non-deterministic communication channel) and of dynamically evolving, diversified hardware and software, the reliability and maintainability of a given solution have become moving targets. The IT&C solution you are using is essentially a service that needs maintenance; otherwise its quality and usability will degrade over time.

Even if you are an end user, you need to familiarize yourself with the changes in the software tools you are using - for you, that's the price of their evolution.

The owners of the “latest and greatest” software services have learned, more or less the hard way, that user needs have to be monitored and addressed as attentively as the ever-changing technical challenges. Beyond mastering industry-specific management processes, someday they will learn that staying relevant in the IT&C business is neither about populating the company with micromanaged people nor about acquiring innovative teams. The right people keep asking and answering “what should I learn next…”

Wednesday, May 1, 2019

It's About The Data


As a computer science student I loved to do things my own way, and now, after more than 30 years, I'm still happy to express myself by working on personal projects, even if I consider it very important to respect customers by offering them a great user experience, and my engineering colleagues by valuing their processes.

With a BCS obtained in the '80s in a Comecon country, over the years I've had to dig my way out of inadequate tools and ignorance - just like most of my peers.

Due to life circumstances I learned to listen to users attentively and to test my code thoroughly long before I had the opportunity to join a team of engineers and work for the market. At that time I knew nothing about processes, practices, and habits, but as a synthetic thinker I was fascinated by the architecture of their product, one of the few integrated software packages available in my country back then.

Given that for decades a shortage of capital has been one of the essential problems of my geographic region, I've done my best to meet people interested in attracting investors from other countries.

The global spread of low-cost Internet has made it possible for millions of like-minded people to start telecommuting and learning proactively, instead of only following instructions. That was the moment when I realized that searching, filtering, and analyzing data is one of the key skills for managing ourselves in the 21st century.

Then I experienced the importance of the "learning by doing" method for developing and testing software, and I took courses to gain a better understanding of the big picture: SDLC types and agile frameworks like Scrum, Lean, and recently DevOps.

Taking a well-elaborated DevOps course has been a great help to me in understanding the importance of doing software development and maintenance in small steps, driven by hypotheses and experiments supported by data collected in production.

Filling data stores with numbers resulting from continuous monitoring, or from each visitor's route on a website, is not about dehumanizing the relationship with users - listening to the users' voice (by means of support, social networks, and surveys) is also essential for gathering the data needed to make the right decisions at the right time.

DevOps practitioners consider that starting from lean practices and making use of automation, data mining, experimentation, and continuous learning in rapid iterations is more efficient for navigating our globalized data lake than prioritizing backlog items based on one or two people's opinions.