Podcasts about Basel II

  • 13 podcasts
  • 15 episodes
  • 26m average duration
  • Infrequent episodes
  • Latest episode: Aug 27, 2024


Related Topics:

operational

Best podcasts about Basel II

Latest podcast episodes about Basel II

Reliability Matters
Episode 150: The Truth Behind Statistics

Reliability Matters

Play Episode Listen Later Aug 27, 2024 60:17


Statistics, at its core, is the science of making sense of data. From predicting trends and making informed decisions to ensuring quality control and optimizing processes, the applications of statistics are vast and varied. In the electronic assembly industry, where precision and reliability are paramount, statistical techniques are indispensable tools for engineers, manufacturers, and quality assurance professionals alike. Join us as we unravel the complex yet captivating connections between statistics and the truth. We'll delve into real-world case studies and uncover the statistical principles that ensure the decisions we make every day are based on facts and accurate data.

In today's episode, we're also going to tackle some common myths associated with statistics and shed light on how misinterpretation of data can lead to false conclusions. Many people think of statistics as infallible, a definitive answer to every question posed by data. However, this couldn't be further from the truth. Statistics is a powerful tool, but its effectiveness hinges on proper application and interpretation. We'll discuss myths such as "correlation equals causation," where the mere relationship between two variables is mistaken for one causing the other. We'll also address the misconception that a larger sample size always guarantees accurate results, and how ignoring the context or the source of data can lead to misleading outcomes. Moreover, we'll explore real-world examples where statistical missteps have led to costly errors and how these pitfalls can be avoided through rigorous analysis and critical thinking. By understanding these common misconceptions and learning how to approach data critically, you'll be better equipped to harness the true power of statistics.

My guest today is Aaron Brown. Aaron teaches statistics at New York University and at the University of California at San Diego, and he writes regular columns for Bloomberg and Wilmott. In the late 1980s and early 1990s, he was a key participant in developing modern financial risk management and one of the original developers of Value-at-Risk. He also helped develop the rules that eventually became known as Basel II. Aaron holds an M.B.A. in Finance and Statistics from the University of Chicago and a BS in Applied Mathematics from Harvard.
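One of the myths mentioned above, that a larger sample size always guarantees accurate results, can be illustrated with a short, entirely synthetic sketch (the population and the sampling scheme are invented for illustration): if the sampling process itself is biased, more data only makes the estimate more precisely wrong.

```python
import random

# Synthetic illustration of the "bigger sample = accurate results" myth.
# The population is uniform on [0, 1] (true mean 0.5), but the sampling
# process never observes values below 0.3, so the estimated mean converges
# to the mean of the truncated distribution (about 0.65), not the truth.

random.seed(0)
TRUE_MEAN = 0.5

def biased_sample_mean(n):
    """Mean of n draws, where draws below 0.3 are silently discarded."""
    draws = []
    while len(draws) < n:
        x = random.random()
        if x > 0.3:  # selection bias: small values never enter the sample
            draws.append(x)
    return sum(draws) / n

for n in (100, 10_000):
    estimate = biased_sample_mean(n)
    # The bias (about 0.15) does not shrink as n grows.
    assert abs(estimate - TRUE_MEAN) > 0.1
```

The same pattern appears whenever the source of the data is ignored: a larger n simply narrows the confidence interval around the wrong value.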

FinPod
What's New CFI: Basel III and Risk Management

FinPod

Play Episode Listen Later Jun 4, 2024 10:43


In this episode of FinPod, hosts Asim Khan and Ryan Spendelow dive deep into the world of banking regulations. Join them as they discuss the intricacies of Basel III and its significance in risk management. Unravel the historical context behind the Basel Accords and understand how Basel III emerged as a response to the 2008 global financial crisis. Asim and Ryan discuss the critical differences between Basel II and Basel III, from strengthened capital requirements to the introduction of liquidity ratios. Tune in to equip yourself with the knowledge and tools necessary to thrive in an ever-changing regulatory environment. Don't miss this opportunity to stay ahead of the curve and enhance your understanding of Basel III and its implications for the future of banking.
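The liquidity ratios mentioned here can be made concrete with a small sketch. Basel III's Liquidity Coverage Ratio (LCR) requires a bank's stock of high-quality liquid assets (HQLA) to cover at least 100% of its projected net cash outflows over a 30-day stress scenario; the balance-sheet figures below are hypothetical.

```python
# Minimal sketch of the Basel III Liquidity Coverage Ratio (LCR).
# All balance-sheet figures are hypothetical, for illustration only.

def liquidity_coverage_ratio(hqla: float, net_outflows_30d: float) -> float:
    """LCR = stock of HQLA / total net cash outflows over 30 calendar days."""
    return hqla / net_outflows_30d

hqla = 120.0          # high-quality liquid assets, $bn (hypothetical)
net_outflows = 100.0  # projected 30-day net cash outflows, $bn (hypothetical)

lcr = liquidity_coverage_ratio(hqla, net_outflows)
assert lcr >= 1.0  # Basel III requires an LCR of at least 100%
```

Basel II, by contrast, focused on capital held against credit, market and operational risk; explicit liquidity ratios such as the LCR (and the longer-horizon NSFR) were additions in Basel III.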

London Fintech Podcast
LFP248 – The Challenges of FS/Fintech Boards esp in the 21stC w/Neil Holden Serial FS NED

London Fintech Podcast

Play Episode Listen Later Apr 4, 2024 54:48


The governance of FS as a whole affects all of our lives, as we saw in 2008, let alone in many individual examples of FS/Fintech businesses failing and losing clients' money. In our digital age virtually all of our money is digital and next to no one sees a share certificate any more, so all of our investments are digital. Thus what might appear to be a refined topic for a few is at the heart of all of our lives – money being used for roughly everything and savings equating to money saved for tomorrow. As a result of this central role of money in economies, Boards in FS have always had to be – let's put this mildly – "rather more careful" than Boards of companies which, if they fell over, would "domino" not very far. Not only do many FS/Fintechs "take care of" clients' money but money itself in the current financial model is created out of thin air by banks. Add to these long-standing issues the ever-increasing burden on Boards as a whole from the always-metastasising Regulatory State, ever-greater minutiae in FS per se and the State using companies to enforce social agendas which even a decade ago would have seemed unlikely/bizarre, and we have a very challenging cocktail. Neil has 20 years' experience on Boards and chairing sub-committees of FS and Fintech Boards and is thus well placed to guide us through this labyrinth.
Topics discussed include:

  • Scottish and Irish traditional folk music
  • transitioning from classical piano to the accordion
  • interesting anomalies re accordions – not least of which one has to fill in the bass part oneself
  • musical insights into Scottish vs Irish music
  • learning music as "play these notes" vs "play along with us" – a lot of folk musicians cannot read music
  • Neil's career journey
  • the long delay between Basel II and Boards getting with it
  • the felt sense of FS Boards vs "normal" companies – cf FS and Sainsburys
  • looking after other people's money – fiduciary duty
  • even if you are the equivalent of a corner shop in FS/Fintech you have to have as much governance as the equivalent of Sainsburys – which in retail obviously does not apply
  • the need to cope with all that overhead when your Fintech is still very small
  • a case study of innovation/Fintech/regulation
  • Banking Alchemy – a broken model of money (see eg LFP085, LFP197, LFP220 and LFP231) leads to a broken banking system which cannot be repaired by any governance, as discussed in many episodes
  • a deep dive into unitary Boards vs sub-committees of Boards and what has driven this
  • the strange situation re independent directors and sub-committees, where execs do most of the talking yet are technically not on the committee – "it is slightly absurd"
  • the volume of "left-brain" stuff heaped upon FS/Fintech Boards absolutely swamps the Board meeting, crowding out creativity in governance and business strategy
  • a comparison of FS vs airline regulation
  • literally no limit to the amount of over-regulation of FS – ever-increasing costs to the consumer as a result, and higher risk-taking as more people in FS/Fintech are doing non-revenue-related activities
  • management by walking around starts failing in the larger organisation, where one is more dependent on management perception/spin
  • comparison with the quantum world – what is observed by the Board changes as a result of being observed
  • the need to get a feel beyond the management information
  • what are the powers of a NED, individually or in a committee
  • the challenge of balancing the need to gain insight and challenge yet not annoy/rile the executives
  • the pressure on the Chairman in these circs
  • "poking your nose in but not poking your fingers in" as summing up the role of the NED
  • 21stC challenges from the whole tsunami of Corporate Governance (and the word literally did not exist from Chaucer up to the 1970s) and "ESG" et al
  • the problem of a "free market" solution to governance and the problem of a "bureaucratic rules" solution to governance
  • the NSDAP ("the Nazis") as the inventors of Corporate Governance in 1930s Germany – to impose their political agendas upon companies – cf now?
  • the abuse of Governance Boards and "guidance" re ESG/DIE etc – the huge pressure to conform to something which is not statutory
  • the imposition of groupthink by regulators, yet regulators say groupthink is the biggest risk for Boards…
  • the idea of safetyism and "nothing should ever go wrong"
  • "there is no real difference ultimately between corporate control and corporate creativity" [as the longer-term risk is the world is changing and the company does not react appropriately]
  • the "war on meritocracy"
  • the major risk of FS/Fintech Boards is groupthink and agreement re the wisdom of the rules/codes/guidance etc which is imposed upon companies from outside the Boardroom
  • Advisory Boards and Strategy days that are not minuted, as important in these circs to have "unruled" conversations
  • "Board Champions" as the latest pressure from regulators, which breaks down the whole idea of collective responsibility and the power of sub-committees
  • Neil's overall advice to FS/Fintech Board members, especially younger folk embarking upon their first gigs
  • the importance of human dynamics within the Board – key cruxes
  • is it a good trade being an FS/Fintech Director? non-commercial rewards in the role
  • Neil's shoutouts for both folk music-ing and his own NED role portfolio state right now
  • the Mutual Model of FS as an appealing one

And much much more.

Open at Intel
Software Supply Chains

Open at Intel

Play Episode Listen Later Mar 8, 2023 45:55


Marcela Melara, a research scientist in the Security and Privacy Research Group at Intel Labs, and Bruno Domingues, a chief technology officer in the financial services industry practice and a SLSA project contributor, share their deep knowledge about software supply chain security, a subject on everyone's mind today. Guests: Dr. Marcela Melara is a research scientist in the Security and Privacy Group at Intel Labs. Her current work focuses on developing solutions for high-integrity software supply chains and building trustworthy distributed systems. She has several publications and patents filed related to her research, and leads a number of internal, academic and open-source efforts on software supply chain security. Prior to joining Intel, she received her PhD in Computer Science from Princeton University and did her undergraduate studies at Hobart and William Smith Colleges. She is a Siebel Scholar, a member of Phi Beta Kappa, and her research on CONIKS was awarded the Caspar Bowden PET Award. Outside of work, Marcela is an avid gardener, bookworm, hiker, and gamer. Bruno Domingues is a Chief Technology Officer in the Financial Services Industry practice (SMG), where he is responsible for technical direction and pathfinding across Intel's product portfolio. He serves as the champion for Digital Transformation in the Financial Services domain. Before joining Intel in 2007, Bruno worked with Microsoft. He was a pioneer in the FSI vertical practice back in the '90s and developed a rich ecosystem of partners around the Microsoft platform to solve the industry's most challenging problems. With over 23 years of experience in applying technologies, Bruno has developed a deep understanding of the financial industry: he has worked with regulators to help banks ramp up on Basel II and III, architected mission-critical trading-desk operations, and built nationwide inter-bank online payment systems in different markets and regions of the world.
In the last 15 years, Bruno has been focused on cloud adoption in the financial services industry, as it is a unique industry with unique requirements. Bruno has also served as IEEE Computer Society chairman (R9), Academic Liaison Director with CMG, and Board Advisor for fintechs.

VOV - Kinh tế Tài chính
VOV - Before the Opening Bell: 13 banks complete the Basel II accord standards

VOV - Kinh tế Tài chính

Play Episode Listen Later May 20, 2021 4:36


- The State Bank of Vietnam (Ngân hàng Nhà nước) is about to issue a series of new documents to accelerate digital transformation. - 13 banks have completed the Basel II accord standards. - Securities company VIX moves against the trend of securities stocks, plus assessments of world commodity markets. Topic: 13 banks, Basel II accord standards --- Support this podcast: https://anchor.fm/vov1kd/support

VOV - Kinh tế Tài chính
VOV - Before the Opening Bell: More than 350,000 billion VND in supplementary budget funds for localities

VOV - Kinh tế Tài chính

Play Episode Listen Later Nov 16, 2020 4:37


- More than 350,000 billion VND in supplementary budget funds for localities. - Foreign direct investment rose in October. - Ban Viet Bank completed the three Basel II pillars ahead of schedule. - Assessments of world commodity markets. --- Support this podcast: https://anchor.fm/vov1kd/support

The FS Club Podcast
Meeting Bank of England, Prudential Regulation Authority & Financial Conduct Authority Demands For Operational Resilience - Bringing Together Multiple Disciplines

The FS Club Podcast

Play Episode Listen Later Nov 11, 2020 47:53


Find out more on our website: https://bit.ly/3qDdDYf In this webinar, Anita and Palvinder seek to: put the incoming UK operational resilience regime in context and provide a view of its key elements; give a flavor of BOE/PRA/FCA expectations; and highlight why this framework is multidisciplinary and requires a change in perspective. Speakers: Anita Millar is a risk and public policy professional with over 20 years of experience working in the financial services sector. Her career spans risk management, audit, and consulting on regulatory change. She has held roles at ISDA, HSBC, Insight Investment, AFME, Northern Trust, and RBS. Her regulatory work spans cross-border regulation, Basel II (credit & operational risk), Basel III (capital & liquidity), MiFIR, EMIR, EU Banking Union and a proposed EU Financial Transaction Tax (FTT). She has also written research reports for the City of London Corporation/TheCityUK and Invest Europe (previously EVCA). Palvinder Gill is a senior prudential and risk practitioner, having held roles at Credit Suisse, Macquarie Group, Morgan Stanley, Nomura and Wells Fargo. He has also worked as a policy maker at the Bank of England and FSA and as a regulatory consultant at EY. His prudential capital and liquidity risk projects include Basel II, Basel III, ICAAPs, ILAAPs, Recovery Plans, FRTB, IFR/IFD, Pillar 3 Disclosures and TCFD climate disclosures.

Finance & Fury Podcast
Why do banks seem to have the ability to lend never ending amounts of money?

Finance & Fury Podcast

Play Episode Listen Later Sep 18, 2020 19:53


Welcome to Finance and Fury, the Furious Friday edition. Today – discuss the topic of banking policy changes and how this opened the gates for the potential of never-ending money supply in the modern banking system. To start with – look at how money gets lent out in Australia. Well – by a bank of course – you go to a bank to borrow money – but what are they allowed to lend out? Well in basic economics – banks are treated as a financial intermediary – their role in a traditional sense is to connect savers to borrowers – they act as the middleman. So a saver with surplus cash will put it into the bank – the bank will then use this as a reserve and lend out based around this. Under this situation – a bank's ability to lend is limited by how much they have of their customers' savings – which act as the deposits. Because in order to lend more money – they need more depositors – no depositors – no loans. However – this theory is based around what is known as fractional reserve banking – where a commercial bank has a set reserve requirement and will lend out at a multiple of those reserves. The classification of reserves was expanded upon over time – in addition to depositors' funds, reserves included treasury bonds and deposits at the RBA – but depending on monetary policy – lending could be limited. As an example – say the reserve requirement is 10% – then the multiplier is 10 times – if the bank has $1m of deposits they can lend out $10m. But this concept is rather misleading in the modern era of banking – I mentioned in Wednesday's episode that Australia does not have an official fractional reserve banking system. This was abolished when we brought in the Basel standards – 'Basel I' – which was implemented in 1988. Central to the design of the Basel capital standards is the idea that a bank should hold capital in relation to its likelihood of incurring losses. In the modern era – a bank's capital simply represents its ability to withstand losses without becoming insolvent. Hence – a
capital adequacy requirement is set – monitored and regulated by APRA based around guidelines set by the BIS using the Basel standards. I do see one reason why there was a need for this movement away from the reserve requirements – in the modern economy where deposit accounts are insured by governments – it is likely that banks would have found it tempting to take undue risks in their lending operations, since the government insures deposit accounts. So these regulatory capital requirements have at least removed this moral hazard. But it has opened up the floodgates for lending – and skewed the traditional incentives of lending – so let's look at it further. How do the capital adequacy requirements work? First – look at the capital that has replaced depositors' funds as the reserve requirement – this is broken down into Tier 1 and Tier 2 capital – where the sum of these two makes up the reserve requirement – net of any deductions on the banks' balance sheets. Tier 1 – Tier 1 capital consists of the funding sources to which a bank can most freely allocate losses without triggering bankruptcy – essentially, assets that can be liquidated (sold), written down or converted to cover losses quickly – hence it avoids a bankruptcy. It includes ordinary shares in the bank and retained earnings that the bank has on its balance sheet – these make up most of the Tier 1 capital held by Australian banks. But Tier 1 capital also includes specific types of preference shares and convertible securities – such as capital notes. Convertible securities, for example, were included in the Basel II definition of Tier 1 capital on the premise that banks would exercise their option to convert them into common equity whenever additional capital was needed.
However – since it is more difficult for banks to allocate losses to these instruments – APRA set a limit of 25% of Tier 1 capital being allowed in this form. The APRA requirement is 10.5% for the capital adequacy requirement – or 10.5% of a bank's risk-weighted credit exposures – the loans that may not be able to be repaid. Tier 2 – considered to be less liquid or convertible than Tier 1 – in some cases these instruments may only be effective at absorbing losses when a bank is being wound up. It provides depositors with an additional layer of loss protection after a bank's Tier 1 capital is exhausted – it primarily consists of subordinated debt – though it also comes in other varieties. Both Tier 1 and Tier 2 capital are measured net of deductions. These are adjustments due to the way accounting measures are treated – sometimes the banks will have forms of equity used to balance their holdings of intangible assets – things like goodwill – so if a bank is going to go bankrupt – this loses all of its value. Secondly – have to measure the risks that this capital requirement is set against. For capital adequacy purposes, Australian banks are required to quantify their credit, market and operational risks. The most significant of these risks is credit default risk – or bad loans emerging from people defaulting on their loans – which is part of a bank's traditional lending activities. This credit risk is measured as the risk-weighted sum of a bank's individual credit exposures, which gives rise to a metric called 'risk-weighted assets'. The standardised approach for these risk weights is prescribed by APRA for smaller banks – based on the risks of default and other characteristics of each loan the bank is exposed to. For example – take one residential mortgage – if it has a loan-to-valuation ratio of 70%, no mortgage insurance and the borrowers are managing to make repayments – APRA specifies a risk weight of 35% – so for every $100 of outstanding debt – the risk-weighted asset would be $35. However
– the risk weight for corporate (business) loans is 100%. For the big 4 – they use an alternative Internal Ratings-based approach whereby risk weights are derived from their own estimates of each exposure's probability of default – so the bank can set the limits for the risk weight against each loan. Where does the market currently stand? Banks have been busy – the amount of capital held by the Australian banking system has been increasing – rather rapidly since 2014 – it went from a capital adequacy ratio of 12% to 16.3% in June – this is a combination of Tier 1 and Tier 2 capital. The rise in the banking system's Tier 1 capital mostly reflects a large amount of new equity in the form of share issuances as well as capital notes that have been issued to the market. Covered this as part of the bail-in topic a while back – but the banking system has been preparing for some downturn in loans for some time. Over time – it was also through dividend reinvestment plans occurring over the years that the banks' Tier 1 capital has been growing – up until recently. Also – with a lot of banks cutting back on dividends – their retained earnings have also boosted the Tier 1 capital more than the reinvestment of dividends normally would. Another major trend over the years – thanks to recommendations from the Basel standards – lend more to households than businesses – that way your risk weights are lower. There has been a large shift in the composition of banks' loan portfolios towards housing lending – it attracts much lower risk weights than business and personal lending. Reversal in lending trends – business loans used to make up the lion's share – in the 1990s – housing accounted for about 25%, business loans about 65% – today these are reversed. It makes sense from a risk-weighted asset point of view. As an example – the RBA released a paper back in 2010 – $3.9 trillion of lending by the banks with all kinds of loans – based around these risk-weighted methodologies – there was $1.2 trillion in credit risk-weighted
assets – then $2.7 trillion was unweighted assets. Within the risk-weighted total, corporate exposures account for $370 billion, while residential mortgage exposures are lower at around $300 billion, reflecting their relatively lower risk weights. To expand this example further – on the $1.2 trillion in RWA – banks would need about $126bn by today's standards in Tier 1 capital. Bit of a side note – but it was interesting reading a paper from the RBA back in 2010 – it was talking about the forthcoming regulatory developments that are now in place. Increase the quality, international consistency and transparency of the capital base – this includes enhancing a bank's capacity to absorb losses on a going-concern basis, such that more of its Tier 1 capital is in the form of common shares and retained earnings – which has occurred with massive capital raisings in shares of the banks over the years. Ensure that even if a failed or failing bank is rescued through a public-sector capital injection, all of its capital instruments are capable of absorbing losses – this includes a requirement that the contractual terms of capital instruments allow them to be written off or converted into common equity if a bank is unable to support itself in the private market – which has been achieved by the capital notes, which are convertible and form part of the bail-in legislation. So what really affects banks' ability to lend? If bank lending is not restricted by the reserve requirement, then do banks face any constraint at all?
As we have seen – it isn't the reserve requirements. Looking at household debt to GDP over the years – back when lending was constrained by deposits and central bank reserves – it struggled to get over 40% of household debt to GDP – after these requirements were removed – it started to rise by quite a bit – by 2008 it was about 110% – today it is about 120% – so it has slowed over the past 10 years – but still the second highest in the world. But – banks have to keep their capital adequacy in line with the minimum requirements – however this is rather subjective. In essence – banks are only constrained by three factors. First – you have the demand for loans. Banks base their lending decisions on their perception of the risk-return trade-offs – so as long as there are consumers out there with the deposit requirements (or existing equity in property) and the incomes needed to service the loans based around their lending standards – then the banks will lend. There has been no shortage of demand – property markets have been a competitive environment – and with lowering interest rates – the amount people can afford to borrow in the banks' eyes (especially since the serviceability benchmark got dropped over a year ago) has gone up dramatically. Second – the amount of Tier 1 capital they can raise. The sequence of how this works in practice is the opposite of what most people would think – in reality, banks first make their lending decisions (lend the money out) and then go looking for the necessary capital, issuing it to the market to make sure they remain within the requirements. Finally – the measurement of the risk-weighted assets – which is a nominal establishment of how much per loan is considered risky – for example – 35% of a home loan. And since the capital requirements are specified as a ratio whose denominator consists of risk-weighted assets (RWAs) – the level of capital that needs to be retained is in turn dependent on how the risk is measured
– this level – say the 35% – is dependent on the subjective nature of human judgment – and any subjective judgment coming from regulators with close ties to those who work for the banks that they regulate – sometimes comes with the ever-increasing profit desire – which may lead the financial system down the road of underestimating the riskiness of its assets – especially in situations with bubbles in asset prices. In summary – if bank lending is constrained by anything at all, it is how much Tier 1 capital banks can raise, as well as how much the population can afford to borrow. But the changes from 1988 have created a situation where banks were adapting to the changes in the monetary systems around the world – lending in a fiat world. In reality – why wouldn't the banking system do this? The reserve requirements were the foundation of banking under the gold standard – but under the fiat system where money can be created out of thin air – as long as there is somewhere to soak it up – such as the property market through additional mortgages – why wouldn't the bank continue to lend as much as possible? Loans to them are assets – so the more they can lend – the more money they can make. But I hope this episode helps to explain how the modern banking system works. Thank you for listening to today's episode. If you want to get in contact you can do so here: http://financeandfury.com.au/contact/
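The arithmetic walked through in this episode (standardised risk weights, the 10.5% capital adequacy minimum, and the old 10% reserve multiplier) can be sketched in a few lines. The 35% and 100% risk weights and the 10.5% ratio are the figures quoted above; the loan book itself is hypothetical.

```python
# Sketch of the capital-adequacy arithmetic from the episode. The risk
# weights and the 10.5% minimum are quoted in the show notes; the loan
# book is made up for illustration.

RISK_WEIGHTS = {
    "residential_mortgage": 0.35,  # LVR <= 70%, performing, no LMI
    "corporate_loan": 1.00,
}
CAPITAL_MINIMUM = 0.105  # 10.5% of risk-weighted assets

def risk_weighted_assets(loans):
    """Sum of exposure x risk weight over the loan book."""
    return sum(amount * RISK_WEIGHTS[kind] for kind, amount in loans)

def minimum_capital(loans):
    """Tier 1 + Tier 2 capital the bank must hold against this book."""
    return CAPITAL_MINIMUM * risk_weighted_assets(loans)

# Every $100 of this mortgage contributes $35 of risk-weighted assets:
assert abs(risk_weighted_assets([("residential_mortgage", 100.0)]) - 35.0) < 1e-9

# The episode's $1.2 trillion of RWA implies roughly $126bn of capital:
assert abs(minimum_capital([("corporate_loan", 1.2e12)]) - 1.26e11) < 1e3

# Contrast with the old fractional-reserve model (10% reserve, 10x multiplier):
deposits, reserve_ratio = 1_000_000, 0.10
max_lending = deposits / reserve_ratio  # $10m of lending from $1m of deposits
```

Note how the constraint binds on the asset side (how risky the loans are judged to be) rather than the liability side (how many deposits the bank holds), which is the shift the episode describes.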

Dinis Guarda citiesabc openbusinesscouncil Thought Leadership Interviews
citiesabc interview: Jorge Sebastiao - Tech Thought Leadership, Smart Cities and Cybersecurity

Dinis Guarda citiesabc openbusinesscouncil Thought Leadership Interviews

Play Episode Listen Later May 14, 2020 49:57


Jorge Sebastiao is a technology thought leader, influencer and ICT expert in the area of cybersecurity with more than 35 years of experience in technology systems. In this new interview for citiesabc, carried out by Dinis Guarda, cybersecurity is a central theme, as every digital system exposes new – and unexpected – risks. That is especially worrisome in Smart Cities projects, as Jorge Sebastiao points out. From that perspective, both experts speak about technology adoption in the Middle East and in the world, the Covid-19 pandemic and the development of emerging technologies like blockchain, AI or the roll-out of 5G.

Structure:
1. Biography, education
2. Professional career as ICT expert, CTO at Huawei, until now
3. Challenges and opportunities working in different cultures and countries
4. Middle East experience
5. Evolution of technology and its impact
6. Cybersecurity: threats, solutions and emerging technologies; Smart Cities
7. Covid-19 pandemic: how technology can help us fight back against the crisis

Jorge is a seasoned ICT expert in the areas of cybersecurity, artificial intelligence, blockchain, cloud computing and managed services focused on business value. He brings experience, creativity, structure and innovation to the solutions he architects through ICT infrastructure. He has over 30 years of ICT experience, covering C-level roles in cloud computing, AI, blockchain, cybersecurity, physical security, managed services, managed security services, business continuity and disaster recovery, as well as governance, risk management, compliance, auditing and certification. Served sectors include aviation, transportation, oil & gas, banking, financial, telecom, government, defense, healthcare, and education. Jorge created the A6 process of security: Assess, Architect, Apply, Administer, Awareness & Agility. He architects practical, business-focused cloud and security solutions using standards and industry best practices. He has been a speaker at numerous international conferences on topics including ICS security, ICT security, business continuity, ICT architecture (TOGAF), wireless security, regulatory compliance, physical security, biometrics and smart cards, ethical hacking, ISO 27001, ISO 27002 (ISMS), ISO 20000 (ITIL), ISO 13335 (risk management), ISO 22301 (BCMS), COBIT, Basel II, EMV-2, PCI compliance, SOX, info warfare, fraud, identity theft, SAP security, CISSP, forensics and incident response.

JORGE SEBASTIAO LINKEDIN: https://www.linkedin.com/in/sebastiao/
JORGE SEBASTIAO TWITTER: https://twitter.com/4jorge
https://www.citiesabc.com/
https://twitter.com/citiesabc_
https://www.instagram.com/citiesabc/

The Risk Management Association

Joseph Breeden, Founder and CEO, Prescient Models, discusses how existing models can converge or be modified to adapt and fill different needs, specifically regarding Basel II for capital reserves, CCAR/DFAST programs for stress testing, and IFRS9 and CECL for ALLL.

Contemporary Issues in Finance - Audio
Managing financial risks

Contemporary Issues in Finance - Audio

Play Episode Listen Later Jul 15, 2009 8:17


The various financial risks associated with businesses and how they can be managed.

Contemporary Issues in Finance - Audio
Transcript -- Managing financial risks

Contemporary Issues in Finance - Audio

Play Episode Listen Later Jul 15, 2009


Transcript -- The various financial risks associated with businesses and how they can be managed.

Der wuapaa-Businesspeople-Podcast
wuapaa | Marco Wilkens: Are you creditworthy?

Der wuapaa-Businesspeople-Podcast

Play Episode Listen Later Feb 1, 2008 2:36


For a year now, Basel II has been deciding a company's creditworthiness. But what happens when the rating is poor?

Volkswirtschaftliche Fakultät - Digitale Hochschulschriften der LMU
Three Empirical Essays on House Prices in the Euro Area

Volkswirtschaftliche Fakultät - Digitale Hochschulschriften der LMU

Play Episode Listen Later Jan 23, 2007


The real euro area house price has increased steadily since 1997 reaching more than 6.5% in 2004 and 2005. The rise on its own is not as striking as the long lasting effect of the phenomenon. Indeed, it is the longest lasting house price increase ever experience in the euro area since data are available. Due to a lack of data, I use panel econometrics. The estimated coefficients are then used to generate euro area fitted values in all three papers. In my first paper, I investigate the question whether we can explain past house price movements by the arbitrage theory. Since arbitrage is a static phenomenon, I use within FE and RE estimators. The pattern of the residuals proves that arbitrage is not the core mechanism in house price determination. This may be due to large transaction costs, but also housing is un-tradable by nature, and finally government regulations. Alternative explanations are, friendly tax schemes to promote home ownership against alternative assets, government regulations like tenants rights, and finally massive public investment to build social houses for small income households. Instead, in my second paper, I suggest to analyze directly the interaction of supply and demand together with the mortgage market. Due to endogeneity, Anderson & Hsiao (1981) and the GMM Arellano-Bond (1991) dynamic panel estimator (AB) are used. However, in macro-panel data, the time dimension is much larger than the cross-sectional dimension. A trade-off arises between efficiency and consistency. Thus, the results are checked by means of LSDV (Least Square Dummy Variables) estimates, which are indeed close to the AB estimates. In the third paper, a general equilibrium model underpins the choice of macro-variables. To disentangle long-term from short-term dynamics, the PMG (Pooled Mean Group) estimator developed by Pesaran, Shin & Smith (1997) is used. 
The PMG assumes homogeneity of the long-run coefficients (or a subset of them) without making the implausible assumption of common short-term coefficients: short-term coefficients are allowed to vary across countries. Indeed, mortgage and housing markets in the euro area are strongly heterogeneous. The conclusions on short-term house price dynamics are consistent between the second and third papers: disposable income per capita and the autoregressive term mostly drive house price dynamics, while demographic variables and rents play no role in the short term. The conclusions on long-term house prices come mainly from the third paper. The empirical results of the panel ECM suggest a strong long-term relationship between house prices and disposable income, interest rates, the stock of dwellings, population, and also mortgage loans. However, the long-term house price equilibrium is mainly driven by disposable income and interest rates. How, then, can we explain the long-lasting house price increase in the euro area? First, in the wake of the monetary union process, households in the euro area experienced a positive shift in their borrowing capacity, which had a positive impact on house price dynamics. Second, the euro area business cycle since 1999 has been stabilized by an optimal leaning-against-the-wind policy that has no precedent in the euro area. The slowdown of the economy since 2001 was offset by an accommodative policy. Monetary policy stance indicators such as the interest rate and mortgage loan developments point to a leaning-against-the-wind policy which sustained strong housing demand. The excess-demand factor indicates that sticky supply did not react quickly enough; the overshooting of demand kept house prices soaring. The current house price cycle has been above 2% since 1998, i.e. already 8 years (10 years with our adjustment process), the longest-lasting cycle ever experienced.
Two effects explain this fact: the low level of interest rates due to the central bank's gained credibility, and the low variation in interest rates (optimal monetary policy). By contrast, the previous cycle was much shorter, lasting from the mid-80s to the mid-90s. As inflation was already rising at the end of the 80s, the Bundesbank raised its discount rate in 1988 and kept tightening until 1992 because of German monetary reunification. The one-to-one exchange with the East German mark obliged the Bundesbank to pursue an even harsher policy, and the other member countries of the ERM (European Exchange Rate Mechanism) had to follow, at odds with an optimal monetary policy. Consequently, "euro area" interest rates increased dramatically: beyond the cyclical component, the overall interest rate level was excessively high. The financial liberalization of the mid-80s caused sharp real estate price increases; thereafter, high interest rates coupled with a weak business cycle dampened soaring house prices relatively quickly. Currently, the ECB's reputation has allowed a structural fall in both real and nominal interest rates, as well as an optimal monetary policy in view of business cycle fluctuations. This explains why the current house price cycle has already lasted twice as long as the previous one. As regards the mortgage market (second paper), collateral (the house price) is the only core factor, so the housing and mortgage markets interact strongly via collateral. Under asymmetric information, banks relax their lending standards when house price prospects are favourable. Households can then obtain more mortgage credit, which fuels demand, and higher house prices add to households' wealth. This self-perpetuating process is eventually reversed by a trigger event such as a monetary policy tightening. According to the estimates, an interest rate increase of 1% causes a 1% drop in house price inflation and a 0.4% decline in the mortgage loan growth rate in the long term.
The interest rate shock has only a temporary effect on the mortgage market itself, but the drop in collateral (house prices) leads to a long-lasting fall in mortgage loan volumes. In conclusion, an interest rate increase impinges negatively on real house price growth, proving that demand outweighs supply; monetary policy can therefore influence house price growth. The estimates of the euro area house price equilibrium (third paper) reveal three positive misalignments with respect to actual house prices. The first lasted from the second oil price shock until the mid-80s; the second began in the late 80s and ended in the mid-90s; finally, current house prices have overshot the equilibrium price since 2001 and have not yet shown mean reversion. This history is in line with the housing literature. However, the gap between the equilibrium price and the current price cannot be equated with a bubble as defined by Stiglitz (1990): the misalignment of current prices from the long-term equilibrium price is a natural feature of the functioning of the housing market. Supposing the tide starts to turn in 2006, current house prices would smoothly catch up with the equilibrium price in 5 to 6 years according to the panel ECM. In the simulation of the adjustment of current house prices to their equilibrium level, I assume that all explanatory variables reach their steady-state values simultaneously in 2006 and onwards. The adjustment depicts a 4% growth rate in 2006, thereafter 2%, and then a slightly negative rate for approximately 2 years, converging to 0% (in real terms). This is a smooth and soft landing, in contrast to the "biggest bubble in history" documented by the Economist, for instance. This empirical study suggests that no recession will occur, and even less a deflation: first, most of the huge house price increase in the euro area is explained by fundamentals, and second, banks' risk exposure is relatively moderate, as most have already implemented Basel II, which will officially come into force in 2008.
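The long-run pass-through quoted above (a +1 percentage point rate rise lowering house price inflation by about 1 point and mortgage loan growth by about 0.4 points) can be sketched as a stylized adjustment path. The long-run responses are the paper's estimates; the geometric adjustment speed is an assumed illustrative value, not an estimate.

```python
import numpy as np

# Stylized pass-through of a permanent +1pp interest-rate shock.
# Long-run responses (-1.0 and -0.4) are the paper's estimates;
# the 0.5 adjustment speed per year is an assumption for illustration.

years = 10
dr = 1.0                                   # permanent +1pp rate shock
hp = np.zeros(years)                       # house price inflation, deviation in pp
loan = np.zeros(years)                     # mortgage loan growth, deviation in pp
for t in range(1, years):
    hp[t] = 0.5 * hp[t - 1] + 0.5 * (-1.0 * dr)    # converges toward -1.0
    loan[t] = 0.5 * loan[t - 1] + 0.5 * (-0.4 * dr)  # converges toward -0.4

print(round(float(hp[-1]), 2), round(float(loan[-1]), 2))
```

After a few years both paths settle near their long-run values, matching the idea that the direct mortgage-market effect fades while the collateral channel makes the fall in loan volumes long-lasting.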

Modeling dependencies between rating categories and their effects on prediction in a credit risk portfolio

Mathematik, Informatik und Statistik - Open Access LMU - Teil 02/03

Play Episode Listen Later Jan 1, 2006


The internal-ratings-based Basel II approach increases the need for more realistic default probability models. In this paper we follow the approach taken in McNeil and Wendin (2006) by constructing generalized linear mixed models for estimating default probabilities from annual data on companies with different credit ratings. In contrast to McNeil and Wendin (2006), the models considered allow parsimonious parametric specifications to capture dependencies of the default probabilities on both time and credit ratings simultaneously. Macro-economic variables can also be included. Estimation of all model parameters is facilitated by a Bayesian approach using Markov Chain Monte Carlo methods. Special emphasis is given to the investigation of the predictive capabilities of the models considered; in particular, predictable model specifications are used. The empirical study using default data from Standard and Poor's gives evidence that the correlation between credit ratings decreases as the ratings lie further apart, and that it is higher than the correlation induced by the autoregressive time dynamics alone.
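The model class discussed above can be sketched with a forward simulation: default probabilities follow a logit-linear model with a rating-specific intercept plus a common AR(1) time effect, which is one simple way such models induce correlation across years and rating classes. All parameter values below are assumptions for illustration, not the paper's estimates, and no Bayesian estimation step is shown.

```python
import numpy as np

# Sketch of a generalized linear mixed model for default probabilities:
# logit(PD_{r,t}) = a_r + b_t, with b_t an AR(1) common time effect.
# Intercepts, rho and sigma are illustrative assumptions.

rng = np.random.default_rng(2)
ratings = {"A": -6.0, "BBB": -4.5, "BB": -3.0, "B": -1.8}  # assumed logit intercepts
rho, sigma = 0.6, 0.3          # assumed AR(1) dynamics of the time effect
T = 25

b = np.zeros(T)
for t in range(1, T):
    b[t] = rho * b[t - 1] + sigma * rng.normal()

# Default probabilities per rating class and year via the inverse logit link
probs = {r: 1.0 / (1.0 + np.exp(-(a + b))) for r, a in ratings.items()}

# The shared time effect moves all rating classes together, creating the
# cross-rating correlation the paper models; riskier classes stay riskier.
print(probs["A"].mean() < probs["BBB"].mean() < probs["BB"].mean() < probs["B"].mean())
```

In the paper, richer parametric structures let this cross-rating correlation vary with the distance between rating classes, which is what the Standard and Poor's data support.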