
Authorship

Posted Jul. 19, 2021, under About

Books on Strategy, Finance & Risk (based on data modeling)

The primary purpose of modeling is to enable good decisions. Drawing on a decade in CMBS stress testing, this monograph examines credit and default risk: it deconstructs these concepts to show that defaults are easy to define but difficult to measure and, more importantly, that default risk cannot be used to determine credit risk, and that pre-payers cannot be lumped together with defaulters because they belong to different population demographics. It then reconstructs the credit and default risk models into a much better model for forecasting securitized mortgage portfolio defaults. Vasicek is replaced by the Default Covered-Call Model, which consists of the Sustainable Cash Flow Line, the Lender's Default Function and the Borrower's Default Function. While the securitized mortgage industry addresses Dodd-Frank risk retention from a lender's perspective, from an investor's perspective Dodd-Frank is wanting: it is vague and needs much rework. Finally, a new bank product is proposed that would substantially reduce the impact of mortgage risk in a contracting economy; this is expected to reduce bank capital requirements.
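For context, here is a minimal sketch of the standard Vasicek single-factor model that the monograph argues should be replaced; the parameter values are illustrative and not taken from the book.

```python
# Standard Vasicek single-factor default model (sketched for context only;
# this is the model the monograph replaces, not its replacement).
from math import sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def vasicek_conditional_pd(pd, rho, z):
    """Probability of default conditional on the systematic factor z."""
    return N.cdf((N.inv_cdf(pd) - sqrt(rho) * z) / sqrt(1.0 - rho))

# Illustrative inputs: 2% unconditional PD, 15% asset correlation,
# and a -2 sigma systematic downturn:
print(vasicek_conditional_pd(0.02, 0.15, -2.0))  # ~0.08, i.e. ~8%
```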

The primary purpose of examining data is to enable good decisions. This requires the ability to build not just a model but a knowledge domain from the data. Data analysis is primarily concerned with how to extrapolate out-of-sample from data to provide business information. This book is about data modeling, with examples from complex real-world data: the expansion of the Universe, Multiple Sclerosis, COVID-19, and the new Collated Distributions, Wilcoxon Regression and Levy Distribution. How does one construct the knowledge domain for the problem at hand from the data? It requires building the context structure, which consists of several individual cooperating models. Determining these models requires an understanding of how to partition the data, which is not the same as stratification or machine learning. The partitioned data consists of individual processes that are evidenced as different probability distributions. Beware: much statistical testing requires that specific axiomatic requirements be met, and they usually are not. Through complex data examples, this monograph teaches how to read such data and why the data expects us to ask many more questions than we do.
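A minimal sketch of why partitioning matters (this illustration is mine, not the book's method, and the two synthetic processes are assumptions): pooled data drawn from two different processes fails a normality test that each process passes on its own.

```python
# Two distinct processes pooled into one sample: the pool violates the
# distributional assumptions that each individual process satisfies.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
process_a = rng.normal(loc=0.0, scale=1.0, size=500)  # first process
process_b = rng.normal(loc=5.0, scale=0.5, size=500)  # second process
pooled = np.concatenate([process_a, process_b])

for name, sample in [("pooled", pooled), ("A", process_a), ("B", process_b)]:
    stat, p = stats.shapiro(sample)
    print(f"{name:>6}: Shapiro-Wilk p = {p:.3g}")
# The pooled p-value is effectively zero: treating two populations as one
# breaks the axiomatic requirements the paragraph warns about.
```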

In 2008, Wall St. crashed and took the US economy down with it. The incoming Secretary of the Treasury, Tim Geithner, reported that the Great Recession wiped out $15 trillion in household wealth, cost 9 million jobs, caused 5 million homeowners to lose their homes, and pushed 9 million Americans below the poverty line. What a legacy! How was this possible? Modern finance teaches that unsystematic risk is fully diversified away when one constructs portfolios; therefore, supposedly, how bankers, analysts and corporate managers behave will not affect the markets, because the unsystematic risk they create is fully diversified away. Not correct. This book shows that unsystematic risk cannot be fully diversified away. By laying the groundwork for financial management that includes unsystematic risk, it provides new analyses and tools to quantitatively monitor equities markets, portfolios and risk scoring.
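For context, the standard equal-weight diversification arithmetic at issue can be written in a few lines; the variance and covariance numbers below are illustrative, not the book's.

```python
# Textbook equal-weight portfolio variance: var/n of the average single-asset
# variance plus (n-1)/n of the average pairwise covariance. It falls toward
# the covariance floor, not to zero.
avg_var, avg_cov = 0.04, 0.01  # illustrative numbers

def portfolio_variance(n, avg_var, avg_cov):
    return avg_var / n + (n - 1) / n * avg_cov

for n in (1, 10, 100, 10_000):
    print(n, round(portfolio_variance(n, avg_var, avg_cov), 5))
# -> 0.04, 0.013, 0.0103, ~0.01: diversification stalls at the floor.
```

The book's argument concerns what ends up in that floor; the sketch only shows the arithmetic both sides start from.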

The Holistic Business Model identifies, in a structured manner, the 48 structural positions and 32 strategies your company can effect, resulting in 2 million variations in your company's strategic environment. This complexity is handled by three layers: the Operations Layer, the Revenue Transaction Layer and the Business Management Layer. Strategy is the migration from one structural position to another in the Business Management Layer. The Model therefore prevents investors, business owners and corporate managers from making incorrect moves, while both enabling them to see their future options and enhancing the quality of their management decisions. The Operations Layer explains why lean manufacturing (JIT and Kanbans) works when it does, why it does not when it does not, and the important considerations when setting up a manufacturing operation, using lessons learned from the semiconductor and Fast-Moving Consumer Goods industries. The Revenue Transaction Layer identifies how your company generates its revenue. Based on 20+ years in manufacturing and management consulting for multinational, large, medium and small companies, Solomon invented the Holistic Business Model, which requires only public information to determine your company's and your competitors' strategies. Four case studies are presented: a manufacturing operation, a home builder, a non-profit and a sea port.
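A back-of-envelope check of the "2 million variations" figure; the counting rule below is my assumption, not the book's stated derivation.

```python
# Assumed counting rule (hypothetical): pair your company's
# (position, strategy) choice against one competitor's.
positions, strategies = 48, 32
own = positions * strategies
print(own)       # 1,536 position-strategy combinations for one company
print(own ** 2)  # 2,359,296 -- roughly the "2 million" the blurb cites
```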

Books on the Foundations of Physics (based on data modeling)

This is the third book in the series rewriting the foundations of physics since the 2007 discovery of the massless formula for gravitational acceleration, g = τc². It covers the origins of gravitational fields, gravity modification engine design, black holes and their black particles, and provides an alternative explanation for the Universe-expansion red-shift data. Hand in hand with this is a rewrite of photon probabilities: how Nature implements probabilities, a better handle on randomness versus probabilities, probability control, probability as a field theory, and how gravitational fields deform probability fields.
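Taking the stated formula at face value, the τ implied by Earth's surface gravity is easy to compute; the physical interpretation of τ is the book's, and this sketch only inverts the formula.

```python
# Invert g = tau * c^2 for Earth's surface gravity.
c = 299_792_458.0  # speed of light, m/s
g = 9.80665        # standard gravity, m/s^2
tau = g / c**2
print(tau)         # ~1.09e-16 per meter
```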

This book, Unifying Gravity with the Atomic Scale, proposes that a new physics exists. The findings are based on 18 years of extensive numerical modeling with empirical data, and are therefore both testable and irrefutable. In 2012, Prof. Nemiroff, using NASA's Fermi Gamma-ray Space Telescope photographs, showed that quantum foam cannot exist. In the same year, Solomon showed that both exotic matter and strings could not exist. In 2015, the Kavli Foundation, with Prof. Efstathiou, Prof. Pryke and Prof. Steinhardt, discussed the issues raised by the Planck Space Telescope's findings of a Universe that is significantly simpler than our theories. Hence the need for a new physics. Replacing the Schrödinger Wave Function with the simpler Probabilistic Wave Function results in a new electron shell model, based on the Rydberg equation, that gives results in exact agreement with quantum mechanics; this leads to a new Standard Model and the unification of photon shielding, transmission and invisibility as the same phenomenon. Solomon's inference is that any current or future stealth technology can be neutralized.
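The Probabilistic Wave Function itself is the book's contribution and is not reproduced here; what can be sketched is the standard Rydberg equation the new shell model is said to build on, e.g. for the hydrogen Balmer series.

```python
# Standard Rydberg equation for hydrogen: 1/lambda = R_H (1/n1^2 - 1/n2^2).
R_H = 1.0967758e7  # Rydberg constant for hydrogen, 1/m

def rydberg_wavelength_nm(n1, n2):
    """Wavelength in nm of the hydrogen line for a transition n2 -> n1."""
    inv_lambda = R_H * (1.0 / n1**2 - 1.0 / n2**2)
    return 1e9 / inv_lambda

print(rydberg_wavelength_nm(2, 3))  # H-alpha, ~656 nm
```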

An Introduction to Gravity Modification, Second Edition is the result of a 12-year (1999-2011) study into the theoretical and technological feasibility of gravity modification. It presents a new physics of forces, replacing relativistic, quantum and string theories with process models. Gravity, electromagnetism and mechanical forces are unified by Ni fields and obey a common equation, g = τc².

Gravity modification is defined as the modification of the strength and direction of the gravitational acceleration, in local spacetime, without the use of mass as the primary source of this modification. It consists of field modulation and field vectoring. Field modulation is the ability to attenuate or amplify a force field. Field vectoring is the ability to change the direction of this force field.
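A literal, toy reading of these two definitions (my illustration, not an engine design): modulation scales a field's magnitude, vectoring rotates its direction.

```python
import math

def modulate(gx, gy, k):
    """Attenuate (k < 1) or amplify (k > 1) the field."""
    return k * gx, k * gy

def vector(gx, gy, theta):
    """Rotate the field by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return c * gx - s * gy, s * gx + c * gy

g = (0.0, -9.81)                # Earth-like field, pointing "down"
print(modulate(*g, 0.5))        # half strength: (0.0, -4.905)
print(vector(*g, math.pi / 2))  # rotated 90 degrees: (9.81, ~0.0)
```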

This book reaches out to a wider audience, not just theoretical physicists but also the engineers and technologists who have the funding to experiment, just as Arno Penzias and Robert Woodrow Wilson experimented with the Holmdel Horn Antenna and discovered the cosmic microwave background radiation. The mathematics is easier than that taught in theoretical physics and is therefore accessible to this wider audience of engineers and technologists.
